Category Archives: featured

My newest game: PassTheGun.com

It’s been out a few weeks already, but I thought I would make a blog post about it.

Pass the Gun is my latest game. The entire game is played on Twitter. Users shoot and/or pass a virtual “gun” to friends on Twitter. Users can earn virtual money by pulling the trigger on the gun (also commanded via tweet); however, if the gun goes off, the user’s score is penalized. Users can also pass the gun to friends. Every action is recorded and tweeted by the Pass the Gun user @passthegun, and people can use the website passthegun.com to track the gun’s history and their scores.

  1. When the gun is passed to you, you can either pull the trigger or pass the gun onto someone else.
  2. To pull the trigger, tweet @passthegun pull. If the current chamber isn’t loaded, you will survive and some danger money will be added to your score!
  3. To pass the gun onto someone else, tweet @passthegun pass @theirusername. Once they have the gun, it’s their turn to play, and you may no longer act until you receive a gun later on.
  4. The more times you pull the trigger, the greater the risk there is of shooting yourself in the head.
  5. If you shoot yourself (or wait too long!), you lose half your bank.

The design is still ongoing as people play (i.e. the infamous and indefinite beta stage); there’s simply no way to anticipate how people will play unless the game is iteratively released and changed. But I’m very happy with how it’s turned out so far. Tim Halbert and I worked out the design and flow of the game together.

I am really excited to see how (and if) it grows. The entire game is based upon social interaction and virality; it also uses a medium that I believe hasn’t been tapped.

Tim and I were astonished to find that there are other Twitter games out there; however, what separates ours from the rest is that it takes absolutely no registration to play. You simply tweet the instructions once you have the gun and you’ll receive a @reply (and a direct message if you are following the user @passthegun).

I find this medium (Twitter) very interesting for games. Twitter offers so many affordances that mediums such as Facebook, MySpace, or other portals lack. Every single tweet (from a user with his or her settings set to unprotected) is accessible to the entire world. I don’t think people realize how big of a deal that is and how that separates their “platform” from others.

I think Pass the Gun has a great chance of succeeding because:

  • It’s on a platform with a simple and established API
  • Similar to Facebook, Twitter handles all of the registration, security of tweets, and notification mechanisms. Yet it does so in a way that isn’t intrusive and is easy to stop (anyone who doesn’t want to play Pass the Gun can simply block the user @passthegun)
  • Anyone can play without having to opt in (unlike on Facebook). All it takes is for someone to pass you the gun!
  • The way the game is set up right now, only players who have played before will receive a new gun when one is spawned (new guns are spawned when a gun is fired). I think this invitation-only mechanic has lots of potential.
  • The game can scale. My server checks Twitter’s servers every 20 seconds (an interval I could easily decrease with more servers); as soon as one server can’t keep up with the number of requests, I can just add another to share the checking (see the sketch after this list).
  • The game is simple! The fiction (Russian roulette) is relevant to normal users.
  • Guns automatically go off 24 hours after a user receives them, so inactive users won’t stall the game.
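
For the curious, here’s a rough sketch of what that 20-second polling loop looks like. This is purely illustrative and written in AS3 like my other projects (the real game logic lives on my server), and the endpoint URL is a placeholder, not the actual authenticated Twitter mentions feed:

import flash.events.Event;
import flash.events.TimerEvent;
import flash.net.URLLoader;
import flash.net.URLRequest;
import flash.utils.Timer;

var pollTimer:Timer = new Timer(20000); // the 20-second check interval mentioned above
pollTimer.addEventListener(TimerEvent.TIMER, poll);
pollTimer.start();

function poll(e:TimerEvent):void {
    var loader:URLLoader = new URLLoader();
    loader.addEventListener(Event.COMPLETE, onMentions);
    // placeholder URL; the real mentions feed requires authentication
    loader.load(new URLRequest("http://example.com/passthegun/mentions.json"));
}

function onMentions(e:Event):void {
    var raw:String = URLLoader(e.target).data as String;
    // ...parse each new mention and dispatch its "pull" or "pass @user" command
}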

Tim and I thought about this design quite a bit before launching, and hopefully we’ve handled a lot of the issues that could come up. Let me know if you want a gun by sending a direct message to my Twitter account.

Flash Artistic Experiment: is feeling good #1

Throughout the semester I developed this Flash art/simulation/game. It was originally supposed to be a game… maybe it still is; it all depends on your definition. It’s called Is Feeling Good. I reserved the name IsFeelingGood.com a few months ago and thought for a while about what could possibly be on the site. (I have a basic idea of what the site will have and will post more about it later.)

Anyway, the following idea stems (literally) from procedurally created experiences. The basic instructions are:

1) Click and hold to collect particles to form a rain cloud.

2) Release to rain and grow plant on right.

If you give plants enough water they will finish growing and move to the left. If you don’t water the active, still-growing plant, it will wither. If you collect too much water, your cloud will begin to leak. Hold even more and your cloud will turn to lightning and shoot down the active plant. There are also a bunch of other subtle things that I included that I’ll let you guess at.

The particles come from the left in tune with the music (I use Flash 10’s Sound.extract() function to procedurally check changes in the sound’s volume). The trees are procedurally created, and I’m happy to say the code is highly optimized, with a bunch of tricks to ensure the tree grows quickly despite everything else going on screen at the same time. While creating procedurally generated trees ended up being much easier than I thought, refining them was extremely difficult. Making a tree look like a tree is hard, and I spent more time tweaking the variables than actually coding. The Flash uses nearly all bitmap graphics (that’s why so many things can be on the screen at the same time).
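
For anyone curious how the volume check can work, here’s a minimal sketch (not the actual project code; the window size and helper name are my own choices) of using Flash 10’s Sound.extract() to estimate loudness around the playhead:

import flash.media.Sound;
import flash.utils.ByteArray;

function estimateVolume(sound:Sound, positionMs:Number):Number {
    var samples:ByteArray = new ByteArray();
    // extract() returns 44.1 kHz stereo floats; grab ~23 ms (1024 frames) at the playhead
    sound.extract(samples, 1024, Math.floor(positionMs * 44.1));
    samples.position = 0;
    var peak:Number = 0;
    while (samples.bytesAvailable >= 8) {
        var left:Number = samples.readFloat();
        var right:Number = samples.readFloat();
        peak = Math.max(peak, Math.abs(left), Math.abs(right));
    }
    return peak; // 0..1; a jump frame-over-frame marks a beat to emit particles on
}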

Remember, you need Flash 10 to view.

Click here to view.

Hope you enjoy!

Soundpen

For my experimental digital art class at Georgia Tech, I created this toy I call SoundPen. It’s a very basic version obviously (although I think some cool things could come from this idea).

The basic idea is to create music by clicking your mouse to place balls that bounce around the screen. There are 3 different types of balls (click while pressing either A, S, or D).

I used Jeff Swartz’s algorithm to change the pitch of the sounds from the vertical balls. The further left a ball is on the screen, the lower the pitch; the further right, the higher the pitch.
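
I won’t reproduce Jeff’s class here, but the core resampling idea behind this kind of pitch shifting looks roughly like the sketch below (my simplification with assumed variable names, not his actual code): step through the source samples faster or slower depending on the ball’s x position and feed the result to a dynamic Sound.

import flash.events.SampleDataEvent;
import flash.media.Sound;
import flash.utils.ByteArray;

var source:ByteArray;  // raw 44.1 kHz stereo floats, e.g. filled via Sound.extract()
var rate:Number = 1.0; // playback rate; < 1 lowers pitch, > 1 raises it
var phase:Number = 0;

function rateForX(x:Number, stageWidth:Number):Number {
    // map the left edge to one octave down (0.5) and the right edge to one octave up (2.0)
    return 0.5 * Math.pow(4, x / stageWidth);
}

var output:Sound = new Sound();
output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
output.play();

function onSampleData(e:SampleDataEvent):void {
    for (var i:int = 0; i < 2048; i++) {
        if (int(phase) * 8 >= source.length) phase = 0; // loop the source
        source.position = int(phase) * 8;   // 8 bytes per stereo frame (two floats)
        e.data.writeFloat(source.readFloat()); // left channel
        e.data.writeFloat(source.readFloat()); // right channel
        phase += rate; // stepping by 'rate' resamples the audio, shifting the pitch
    }
}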

Please be careful when using it: if you place too many balls, your computer might slow down rapidly. If you want to remove the last ball, just press backspace.

There are probably ways I could have fixed that problem (for example, using lower-level Pixel Bender capabilities to process the sound); however, I don’t think it matters too much. You don’t want to place too many balls anyway because you’ll just hear noise.

(Be sure you have Flash 10!)

Picture of Pictures

For my Experimental Digital Media class at Georgia Tech.

Click here to view
(Must have Flash Player 10)

The Internet has provided a new means of data expression. I decided to play with the idea of machine aesthetics by developing an application that creates images of images.

I first saw this effect a long time ago on a poster advertisement for The Truman Show. I always wondered how they made the effect of compiling images together so that, viewed from a distance, they form another image. The program I made this week creates the effect by downloading Flickr images.

The algorithm is very straightforward (I also mention some ideas for expansion in the Flash). Basically, the program downloads Flickr images, either the most recent, those matching a search term, or a username’s public photos, via the as3flickrlib. While downloading, the application calculates and caches each photo’s average RGB color (the main reason my algorithm is fast). Then, based on the user’s supplied number of rows and columns, the program calculates the average RGB color of each cell of the source image and finds the downloaded image whose average color has the closest 3D distance to the cell’s average color. The effect is pretty cool. I offer the ability to use a very high resolution (which expands the original image, before running the algorithm, to near the limit of Flash Player 10’s BitmapData size). The effect is best seen at the highest resolution.
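
To make the two core steps concrete, here’s a sketch of how the average-color cache and the nearest-match lookup can be written (the helper names are mine, for illustration; this isn’t the application’s actual code):

import flash.display.BitmapData;

function averageRGB(bmp:BitmapData):Vector.<Number> {
    var r:Number = 0, g:Number = 0, b:Number = 0;
    for (var y:int = 0; y < bmp.height; y++) {
        for (var x:int = 0; x < bmp.width; x++) {
            var c:uint = bmp.getPixel(x, y);
            r += (c >> 16) & 0xFF;
            g += (c >> 8) & 0xFF;
            b += c & 0xFF;
        }
    }
    var n:Number = bmp.width * bmp.height;
    return new <Number>[r / n, g / n, b / n];
}

// returns the index of the cached photo whose average color is nearest
// (3D Euclidean distance in RGB space) to the cell's average color
function closestIndex(cell:Vector.<Number>, cache:Vector.<Vector.<Number>>):int {
    var best:int = 0;
    var bestDist:Number = Number.MAX_VALUE;
    for (var i:int = 0; i < cache.length; i++) {
        var dr:Number = cell[0] - cache[i][0];
        var dg:Number = cell[1] - cache[i][1];
        var db:Number = cell[2] - cache[i][2];
        var d:Number = dr * dr + dg * dg + db * db; // squared distance; no sqrt needed to compare
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best;
}

In practice the per-pixel loop is the obvious place to optimize further (for example, averaging a scaled-down thumbnail instead of every pixel of every photo).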

Lastly, thanks to Flash Player 10, saving images is supported. I use the JPGEncoder from as3corelib. Unfortunately, the JPGEncoder was too slow, as its algorithm is synchronous. Upon Googling “Asynchronous JPG Encoder,” I found a cool asynchronous JPG encoder class by SwitchOnTheCode.com. While I have criticisms of the class’s choice of asynchronous technique, it does what it says it does.

Anyway, if there is enough support, I may release the class as open source on Google Code. I think there are a lot of ways to alter the algorithm to produce different kinds of image results.

Scoring The Oregon Trail

Introduction

A unit of measurement is defined as “any division of quantity accepted as a standard of measurement or exchange.”[1] Units of measurement are critically important in associating meaning with quantities. One gallon of milk, one-hundred yards on a football field, or a thousand pages in a book are examples of a numerical multiplicity of units that have definite associations with their respective mediums. People identify with a gallon, yard, or page because each unit holds concrete meaning and magnitude.

In the context of video games, measurements of success are determined by evaluations of player experiences and presented as quantifiable units. Unfortunately, the units used in video games are oftentimes simply “points,” calculated from multiple dimensions of mathematical operations, that offer little or no verisimilitude to the game’s fiction. The educational Apple II game The Oregon Trail, played in the late 1980s and early 1990s, while offering fictionally relevant death screens, exemplifies the mistake of using incoherent and abstract “points” to evaluate the success of the player.

In this essay I will first briefly mention the method of score calculation and evaluation by non-digital games and the significance of their choices of score computation and units. Afterwards, I offer my explanation of why video games tend to stray away from the mechanics of score computation offered in non-digital games. I then introduce the Oregon Trail and the game design choices that relate to the pioneer life of the 1840s and 1850s. Next, I outline the problems of the evaluation mechanism of the Oregon Trail and how it regrettably detaches itself from the fiction. Lastly, I propose an alternative means of player assessment in the Oregon Trail and its benefits.

Part 1: Scoring without Computers

There are people who have careers in studying sports statistics. These statisticians predict un-played games via the analysis of played ones. The sheer quantity of logged statistics is overwhelming. In addition to points, basketball has a multitude of statistics for teams and players such as assists, rebounds, blocked shots, and turnovers. Baseball statistics consist of pitch count, balls, strikes, bunts, errors, batting average, earned run average, on-base percentage, and the list goes on. However, while all these numbers are tracked, the only numbers that matter are runs, goals, or points. If a team has the most runs, goals, or points in a game, then they are declared the winner. In these non-computer-based games, the way of incrementing the unit of score is typically single- or low-dimensional. For soccer and hockey, the cause is passing the ball or puck past the goal line to score a goal. For baseball, the cause is crossing home plate to earn a run.

Occam’s razor applies to scoring in non-digital games. Giving bonus points to teams that score a bicycle kick, or subtracting points for missed questions in a trivia game, would add unnecessary complexity. In the context of score culture, players discuss or use secondary metrics in games for comparison. Obviously, secondary statistics correlate with primary statistics: baseball teams with better batting averages tend to win more games. Sports fans use secondary statistics as reasons why their favorite team is better than another. But those interpretations are always subjective and speculative. At the end of the day, the primary statistic is what matters. Who cares if the team’s batting average is the highest in their division if they aren’t making the playoffs? As the late German soccer coach Sepp Herberger famously told his team, “The ball is round, the game lasts 90 minutes, everything else is… theory.”[2] There is no justice in giving extra points to teams with higher batting averages either. Games that apply weight to specific secondary statistics risk ruining the balance of the game and cultivating an unanticipated culture of community assessment.

Part 2: Cultural Effects of Assessments

Evaluation is an embedded part of our lives. Capitalism, by definition, encourages competition and evaluation. Schools rank our performance with exams and assignments. Importantly, test grades are often calculated with a single-dimensional computation: the percentage of questions answered correctly.

More important, especially in relation to video games, is the concept of score inflation and deflation. While a test grade of C implies average, a class full of D students may consider a C above average. In games the same process applies. Both game designers and teachers attempt to predict average results in order to assess player experiences. However, game designers who fail to predict accurately force players into creating community-based assessments. Scoring five goals in a soccer game is considered abnormally high because of the outcomes of previous games. Game designers don’t have the luxury of previous gameplay data; consequently, most video games don’t have built-in interpretations of the player’s experience. Comparisons are often absent in video games, and the gaming community itself turns the scores into meaning. Interpretations are usually best left to the community; games sometimes catalyze community comparisons with scoreboards. “You’ve made the high scores!” is the typical embedded evaluation of a player’s experience. Yet what typically happens is that the community ignores the points and focuses on the experiences themselves as sources of evaluation. This phenomenon is evident in many action games via the conversations players have with one another. “I just beat the second boss!” or “Oh yeah? I reached the third boss and killed him with only a knife!” The experiences become the metrics, and the units of measurement used in the game are disregarded.

Most of the time this failure is attributable to the incoherence between the metric and the fiction. “Points” is a commonly used yet linguistically abstract term with little relevance to the fictions it represents.

These points are often calculated in a meaningless way. Games tend to have a multitude of methods of computing score that have little or no coupling to the player’s experience. In Pac-Man the player amasses points throughout the levels by eating ghosts and collecting dots; however, the competitive Pac-Man culture became more about “What fruit did you get to?”, a reference to the various types of fruit Pac-Man encounters in later levels[3]. The fruit became more significant to the players than the points because the fruit had a tighter coupling to the player’s experience.

Besides encouraging replay and competition, one of the reasons video games have a propensity to add loosely coupled, computationally complex scoring components is the obvious ability of the computer to process and store information. Non-computer-based games left record-keeping to the players rather than to the system itself. The popular game of Tic-Tac-Toe is a perfect example. What if Tic-Tac-Toe had a rule that if a player waits more than 3 seconds on their turn, the other player is declared the winner? One can only imagine the arguments players would have over how long a player waited. The choice to leave this rule out is not coincidental: requiring a stopwatch during play would be ridiculous. The computer, however, provides many affordances for games, and one of those is the ability to track time. Consequently, this additional rule can be added to a digital game without destroying the physical experience of playing it. But in game design, the consensus has generally pointed toward justice and balance rather than coherence; consequently, multiple added rules can dilute the meaningfulness of the score.

For instance, a player may kill fewer than 100 aliens in a first-person shooter but could receive a score of 3104 depending on the number of headshots, the amount of ammunition used, etc. While this approach may seem competitively fairer, the unfortunate fact is that players have a difficult time understanding their score during gameplay when score computations are multidimensional. Only the computer, due to its procedural nature, can keep track of the scores.

Another affordance of computers over humans is processing and memory. Humans playing multidimensional scoring games rarely know how much their score will increase, because the human mind struggles to rationalize the numbers. Games that dispense thousands or even millions of points for random achievements are easy examples of games in which the points’ computation loses meaning. Unfortunately, games that fail to offer a coupled, low-dimensional scoring mechanism risk ruining the player’s ability to improve their performance. A player is expected to master a game by receiving feedback and modifying future game decisions based on that feedback. Yet multidimensional feedback is nearly always more biased, due to the choice of weight for each attribute in the score formula. Score justice is consequently unattainable because of score bias: especially for fiction games, implementing every possible parameter into a final score is typically impossible, so designers must choose the specific attributes they deem important. This process of selection creates the score bias. Additionally, score justice is circumvented via “score loopholes”: players may find strategies for racking up a high score by performing a specific gameplay process repeatedly. An example of a score loophole is found in the 1985 MECC game The Oregon Trail, which I discuss later. The Oregon Trail is an example of a game that attempts to justifiably evaluate player performance with a confusing, zero-coupled, multidimensional scoring computation.

Part 3: The Oregon Trail (1985)

The Oregon Trail, originally conceived in 1971 and produced by MECC in 1974 before being released to the public in 1985, is a heavily fiction-based game about pioneer life on the real Oregon Trail in 1848. The player undertakes the role of a banker, farmer, or carpenter who leads a family across North America on the Oregon Trail. Starting in Independence, Missouri, where most pioneers began their migration[4], the player manages food, clothing, oxen, money, and hunting bullets throughout a long and eventful voyage to Oregon.

The Oregon Trail has tight couplings with its mid-nineteenth-century fiction during most of the playing experience. Nearly every event and action is coupled to the fiction presented in the game; consequently, the game became a very successful teaching tool in elementary schools[5]. Events include members of the wagon party contracting various sicknesses or injuries, or thieves coming at night to steal supplies; these events were faced by travelers along the real Oregon Trail[6]. Choices are also relevant: at the cost of health, food rations and wagon pace can be modified to deal with shortages of supplies. Even the character death results are particularly verisimilar: in addition to the removal of the wagon member from the group, the player has the option of placing a custom engraved tombstone at the place of death, and later players can view these tombstones on their own journeys.

However, strangely enough, The Oregon Trail fails miserably in the win-state department. While most explanations and outcomes in the game are relevant to the life of the pioneers in 1848, the final screen simply shows a numeric score: the result of a series of irrational mathematical operations.

From Wikipedia: “Points are awarded according to a formula weighted by the profession chosen (points are doubled for a carpenter and tripled for a farmer), the number and health of surviving family members, remaining possessions, and cash on hand.”

Surprisingly, the only differences between vocations in The Oregon Trail are the starting money and the bonus multiplier applied at the end of the game. Instead, the game should have offered different affordances for each role. A banker should have better bargaining skills in buying items and trading; a farmer should be able to keep the oxen alive longer and have better hunting skills (in The Oregon Trail’s hunting mini-game the character is unable to carry more than 100 pounds of killed animals back to the wagon); and a carpenter should be able to repair broken parts more quickly than the other two roles. Instead, The Oregon Trail chose simply to alter the amount of starting money for each role and change the bonus multiplier. The Oregon Trail lazily uses vocation selection as a “difficulty mode” selection.

The scoring’s multidimensional complexity, in addition to causing confusion, bakes in gameplay bias. The game says that an “Odysseus” pioneer with more food at the end of the trail ranks higher than a pioneer with less food but more surviving family members. The Oregon Trail argues that a wagon leader’s goal is to balance supplies and money by the end of the trail… and also to be a farmer. The Oregon Trail subjectively assigns value to specific vocations and decisions instead of linking performance to the pioneer narrative.

Part 4: Narrative Evaluation Alternatives

A better alternative to the irrelevant scoring system used in The Oregon Trail is revealing the state of the player’s family/party after settling in Oregon. Instead of attaching a number to the player’s performance, simulate the post-journey situation of the family. Historically, settlers arriving in Oregon sent letters east to other families and friends describing their happiness and circumstances in Oregon[7]. In place of the score calculation, the game could display a letter sent by the player’s family to a fictional family or friend back home in Independence, Missouri, describing their situation in Oregon. Depending on the remaining supplies and cash, the letter is written with a different tone and outcome.

Money is disbursed at the beginning of the game but not earned during the adventure. Sequential checkpoints increase the price of supplies; consequently, players stock up on supplies early to save money. If the player has no money at the end of the trip, the player’s letter home can mention that “money has been tight,” that their house is “small,” and that their kids attend “poor” schools. The more money the player saves, the better the schools and the larger the land they own. If the player keeps less clothing, the letter can describe how tough the winters have been. If they kept oxen, the letter can tell how they’ve used the oxen to travel to town to buy supplies such as clothing to manage the winter. The letter can also read differently depending on the player’s vocation. If the player is a banker, then depending on the other variables he or she has found a job ranked anywhere from “unemployed” to “President of a National Bank” (Congress passed the National Bank Act in 1863, which provided for a system of banks chartered by the federal government). The Oregon Territory was acquired in 1848, the year the player begins the journey, and Oregon became a state in 1859. Farmers could describe their poor or strong harvests, and carpenters could become architects with varying positions.

This custom “alternative ending” approach would help fix the deficient narrative of the endgame. Pioneers in the 1850s journeyed to Oregon for a better life[7]. Integrating those dreams and aspirations into the game adds agency. In reality, all the attributes and parameters of the trip are still incorporated into the final assessment of the player’s experience; however, instead of showing a number, the game shows a narrative in the form of a letter.

In relation to the competitive aspect of the game, narratives do not provide the ability for unbiased comparison. The 1985 Oregon Trail has a high-score table that ranks players according to their numeric score, and each player is given an overall rating, such as “greenhorn,” “adventurer,” or “travel guide,” determined by the score. Since my “letters back home” proposition employs narratives that can only be subjectively evaluated, a ranking system based on those narratives is unfeasible, as different players may view different narratives as more “successful” depending on their personal definitions of success and failure. However, instead of eliminating the “Oregon Trail Top 10” scoreboard, the scores should be replaced with a simple number: the days taken to reach Oregon. Days taken are already directly related to the numeric score: the better the player manages supplies and health, the faster the player reaches Oregon. In fact, days taken reflect the management of the entire experience more directly than the current score formula does, because the current formula only applies to the result of the trip rather than its progress. For instance, suppose a player travels the trail in “good” health, but right before reaching Oregon their health falls to “fair.” That player’s score will be comparatively lower than that of a player who ended in “good” health, even if the latter held “fair” health through the entire trip. The Oregon Trail, as with all games with multidimensional scoring systems, suffers from these unexpected “score loopholes.” The narrative endings solve the problems of point bewilderment and loose coupling, and they encourage varying gameplay experiences to see the different endings.

Summary

Evaluations tend to be quantitative instead of qualitative; consequently, they are often ignored by players. Using abstract and extraneous formulas and bonuses damages coherence, agency, and ultimately immersion. To retain verisimilitude in player experiences, performance evaluations require low dimensionality and fictional relevancy.

The following is a prototype of an example end screen letter a player may see. Modify the inputs to see varying outcomes.

Bibliography

1.    Princeton University, “WordNet Search,” WordNet, Princeton University, 2006.

2.    “Sepp Herberger: Biography,” http://www.fifa.com/classicfootball/coaches/coach=61547/bio.html.

3.    “Pac-Man: Classics Reunited,” http://www.classicgaming.cc/classicS/pacman/playguide.php.

4.    M. Trinklein, and S. Boettcher, “Independence,” http://www.isu.edu/~trinmich/independence.html.

5.    W. Jolley, C. Fujiyama, S. Alami, and L. O’Neal, “The Trail as a Teaching Tool,” 2003; http://web.wm.edu/amst/370/2005F/sp1/Teaching_the_trail.htm.

6.    M. Trinklein, and S. Boettcher, “The Oregon Trail: Hardships,” http://www.isu.edu/~trinmich/Hardships.html.

7.    M. Beaver, “The Oregon Trail,” 2001; http://www.scsc.k12.ar.us/2001Outwest/PacificEcology/Projects/BeaverM/.


Ethics and Online Privacy

Online, privacy barely exists. Every page, email, and instant message can be intercepted, manipulated, and/or logged. The millions of Internet users browsing the web at this very moment might be surprised to learn that their online information is not secure. The Internet’s greatest strength, accessibility, is its greatest weakness, security. Web servers have the ability to store information about every visitor. In order to provide services to their customers, websites must store information on their customers’ machines as well as in their databases. Collecting data is an essential function of web applications; unfortunately, the majority of data collection practices used today are unethical because users are not properly informed of what, how, and why their information is being gathered.

Digital data is virtual; the information does not exist anywhere other than as bits and bytes stored electronically. Unlike a letter that exists on a physical sheet of paper, digital media can be transmitted, duplicated, or modified in microseconds. Online data has the same characteristics. One analogy for how the web works with data is a boy and his father playing catch with a baseball. The child is the client; the father is the web server. The boy throws the baseball to his father, and the father catches it. On the baseball there is writing from the boy that the father can read. The father then erases the writing on the baseball, responds to it with his own, and throws the ball back. Unfortunately, the boy is very young and illiterate and must have his mother (the web browser) write and interpret the messages for him. For example, the boy plays catch with CNN.com and throws a baseball that says “Give me CNN.com’s homepage file” (which his mother wrote) to CNN.com. CNN.com reads the message, writes the HTML file on the baseball, and throws it back. The boy catches the baseball and asks his mother to read it for him. The mother (web browser) checks the file to make sure it does not contain any malicious “writing” (code) and then reads it to the boy. The mother can also remember data (cookies) for the boy that the father wants the boy to write down on the next baseball.

This analogy may seem odd; however, it is a way of understanding how the Internet works. The baseball represents the data being passed back and forth. Unfortunately, that baseball can be “intercepted” on its way from one direction to another. If the writings on the baseball are not encrypted then that person reading the baseball will have access to that data. Additionally, there is nothing stopping the father (the web server) from transmitting the boy’s data to some other person.

Web servers need the pieces of data stored in “cookies” to deliver products; however, the way web servers use cookies poses significant ethical concerns. Cookies are a form of invisible data gathering; most users have no idea that cookies are being stored on their machines. The use of web browser cookies by websites is, in itself, ethical and essential to the web; the problem cookies pose is their potential to be misused and abused. Most websites store enough information to identify individuals, and that cookie data can be compromised if it is stored on a user’s PC in unencrypted form: any user of that PC can read it. Websites should be designed to let users know what information is being stored and how, as well as use encryption to protect data. There is a standard currently in use on the web called a Privacy Policy document: a detailed description of what information a website is collecting. Currently in the U.S.A., only websites that target or knowingly collect information from children under the age of 13 must post a Privacy Policy document on their website. This law, known as the Children’s Online Privacy Protection Act (COPPA), requires users under the age of 13 to obtain parental approval before registering with a website. While the act is well intended, most websites (especially small ones) do not have the resources to verify parental signatures.

There are legitimate counterarguments to enforcing Privacy Policy documents. Verifying that what Privacy Policy documents say matches what websites actually collect is nearly impossible given the vast number of websites in operation. A second problem is that actual regulation is impossible, as the government does not have the resources to verify whether a web server stores the information listed in its privacy policy. Lastly and most importantly, very few users actually read privacy policies. A study done at Carnegie Mellon University [1] finds that privacy policies on U.S. sites average 2,500 words and take about 10 minutes to read (thus costing billions of dollars per year in opportunity cost). The study concludes that because Privacy Policy documents take so long to read, and are difficult to understand, most Internet users ignore them.

While cookies are stored on my machine, data I enter in web forms is stored on web servers. Since Privacy Policy documents are so rarely posted, followed, or read, how can I be assured that the credit card information I entered in a web form to purchase a product won’t be kept by the webmaster? There is no way for me to know how long it’s stored or who has access to view it. Credit card and social security numbers are examples of sensitive information that criminals, rather than companies, seek. Identity theft is so rampant [2] in the United States that businesses lose 221 billion dollars every year. Identity theft has hurt the online economy, according to a survey of online shoppers done by Harris Interactive for Privacy & American Business and Deloitte & Touche LLP [3]: 64% of respondents had decided not to purchase a product from an online company because they weren’t sure how their information would be used.

Currently, solutions are being offered via the use of noteworthy and famous third party vendors such as PayPal; however, many websites choose to store credit card information themselves. Unfortunately, these sites are often unprotected from hackers and criminals seeking to steal the identity of one of their customers. An ethical solution is to have a government regulated list of authorized transaction vendors (like PayPal or Google Checkout) that online transactions must use. The use of any private system should be illegal unless it is on the government’s list of approved transaction middlemen.

While cookies are an important part of online privacy, a report [4] concerning privacy in the European Union notes that protecting personal data from intrusion is not the only part of protecting privacy. Lugaresi reports that “Personal data protection has absorbed most of regulatory efforts devoted to privacy, on the wrong assumption either that it coincides with privacy protection or that it has the same dignity of privacy protection. The misunderstanding of the concept of privacy has determined a devaluation of its value and a lower level of protections of some of its relevant sides, like solitude, anonymity, intimacy and personality [4].”

Lugaresi is correct in his analysis of data protection versus visibility protection. Social networking websites are an example of where data can be digitally protected yet not private. Many users list their phone numbers and addresses on these websites, which, unless privacy options are available and applied on the social networking site, can be accessed by anyone on the social network. In the work environment, this fact is especially important. Many employees post pictures on social networking websites that may be seen as inappropriate by their employers. Tiffany Shepherd was fired from her job as a high school biology teacher after pictures of her in a bikini were found [5] on her social networking site.

I don’t think Tiffany should have been fired, as her pictures were not crude or in bad taste; however, I do respect the right of the school to fire a teacher they believe is poorly representing the school. A New England Patriots cheerleader was fired after she posted to Facebook.com a photo of herself at a party next to a passed-out man covered in offensive markings [6]. In this case, I think the Patriots had every right to fire her: not only was she poorly representing the football organization, but they are a private company and should be able to fire anyone for any reason other than race, gender, religion, disability, or sexual orientation. There are arguments against firing employees without direct cause. Many believe that what they do outside of the workplace is their business. Additionally, company rules are not always transparent to employees. However, private companies need this right to determine who can work in their company. For example, if a male employee had an affair with his boss’s wife, would the boss not be able to fire the male employee because the affair happened outside of work? Of course not! The boss, like all company bosses, should have the right to fire people for events happening outside of work. So, referring back to the Tiffany Shepherd incident: she, along with anyone else, can control what her employers see by simply not posting controversial media on her profile pages.

Similar moral questions arise in public schools. Schools typically have web filters to prevent users from accessing certain websites. In many schools, every page a student visits, whether it is ESPN, eBay, or Facebook, is immediately logged and reported to school administrators. While this oversight seems comparable to companies’, I don’t think public schools share the same ethical standards. The difference is that employees today have the expectation of using some of their computer time for personal reasons, since they often have a company email account and/or are on the computer all day. High school students, who use computers sparingly during class for research purposes, should not be using that time to send personal emails or to visit eBay.

In current practice, social networking privacy is almost an oxymoron. On the one hand, social networking websites offer services to connect users by sharing information. On the other hand, users prefer to restrict the sharing of information to certain parties. One solution that some social networking sites such as Facebook have implemented is privacy controls. Users (employees, students) can select which data is viewable to other users (i.e., employers, teachers). But where does the line between personal responsibility and privacy fall? Concessions need to be made on both sides. I need to realize that what I post on a social networking site is no longer private, and social networking sites should, but should not be obligated to, offer privacy controls. The reason social networking sites should not be obligated to provide privacy controls is that regulation is nearly impossible. Many argue the opposite: that social networking sites should be obligated to have visible, explicit, and easy-to-use privacy controls. However, the only way regulatory agencies could know whether users’ information is being shared with unwanted parties is by either approving website code or monitoring user accounts. Either is made increasingly difficult as new versions of social networking sites are constantly released.

I think this problem is solving itself. Social networking sites compete for users; ones that offer more services, such as privacy controls, are more attractive to customers. While this capitalistic perspective may seem speculative, the online statistics website Alexa.com backs up the claim by ranking MySpace and Facebook, two social networks that offer privacy controls, as the most popular social networking sites in the United States.

Sharing personal data with third parties is a logistical privacy problem for these social networking websites. In order to show relevant advertisements to a specific user, websites analyze that user’s information and show ads corresponding to their data. For example, if a user’s marital status is listed as “single” on Facebook, that user may see a web advertisement for a dating website; or if one of the user’s favorite bands is Coldplay, they might see a banner ad for a Coldplay concert. As long as these websites do not share identifiable information with the companies serving the ads and also notify users that they are sharing data with other companies, the practice is ethical. A counterargument is that these sites should ask permission from the user. Some applications do request the user’s permission to send information anonymously to a statistics service; however, requesting permission could hinder the experience of using the product. I personally think that as long as a service is sending my information anonymously, the service is ethically OK. Whether regulation or enforcement of anonymity is possible is a different question.

Another ethical dilemma is whether companies can sell user data to marketing companies. For instance, TV networks would love to know trends in what users list as their favorite TV shows. Facebook and MySpace can and do provide empirical data to companies. While many object to this practice, as their information is technically being distributed to a third party without their permission, I don’t find it morally wrong as long as the data being sent to companies is aggregated over a group large enough to preserve individual anonymity.

The Internet was built to help share information rather than hide it. Since websites require information to deliver information, they are ethically bound to inform their users, in an explicit and non-confusing way, exactly how their information is being kept. There is no single solution for making websites uphold this moral standard; protecting privacy online is a multi-faceted problem that involves both regulation and laissez-faire policies. Nevertheless, the best weapon against privacy threats is the realization of online privacy vulnerability.
Bibliography

1. N. Anderson, “Study: Reading online privacy policies could cost $365 billion a year,” 2008; http://arstechnica.com/news.ars/post/20081008-study-reading-online-privacy-policies-could-cost-365-billion-a-year.html.

2. “Identity Theft Statistics,” http://www.spamlaws.com/id-theft-statistics.html.

3. “Vague online privacy polices are harming e-commerce, new survey reports,” http://www.internetretailer.com/internet/marketing-conference/578566856-vague-online-privacy-policies-are-harming-e-commerce-new-survey-reports.html.

4. N. Lugaresi, “Principles and Regulations About Online Privacy: ‘Implementation Divide’ and Misunderstandings in the European Union,” 2002.

5. “Tiffany Shepherd fired for wearing Bikini?,” 2008; http://www.newspostonline.com/world-news/tiffany-shepherd-fired-for-wearing-bikini-2008103111672.

6. “Patriots Cheerleader Fired over Facebook Swastika Photo,” 2008; http://www.foxnews.com/story/0,2933,448044,00.html.

Web Standards

As many Flash Developers have heard, ECMAScript 4 has died. I’m not going to go into the details of the decision, although you can most likely blame Microsoft and their 80% browser stronghold on the web.

Nevertheless, the decision has raised significant issues around the web regarding web standards and whether or not they have worked in the past.

Take, for example, IE and Firefox. Every web developer has experienced the browser compatibility problem; 90% of forum posts start with “Strange CSS bug, works in IE but not in FF” or “Script works in FF but not in IE,” etc. Ironically, Adobe Flash’s success has largely come from this problem, as Flash runs ubiquitously on every browser. While the W3C has made progress on creating web standards, in general it’s been a failure. Even newer browsers, aware of the recommended standards while in development, still don’t pass Acid2. But why has this failed?

I think it’s by choice. IE, Firefox, Safari, and Opera fight for the larger share of browser usage. If coders create a site that fits their browser, the idea goes, more users will download that browser. For a long time, IE was the standard. Whenever a site didn’t work in IE, as a developer, you MADE it work there, knowing that 90% of the visitors to your site would have IE.

But then a phenomenon happened: the open source movement. Firefox arrived as an open source competitor. It’s not that Firefox was better than its competitors (it wasn’t), but because Firefox was open source, it had more power than other competitors to challenge the browser-dominating Internet Explorer. Firefox complies with the web rather than making the web comply with Firefox. This development strategy could only have existed on an open source project, as the developers of the project did not have the authority or the mindset to take over the web.

But as mentioned earlier, Firefox has led to major problems for web designers and coders. It’s a pain to create stuff! Everyone wants a page to look the same on every browser and on every computer. Only a dream… and it will stay a dream.

Having standards would fix this problem; however, how would the actual products react? How would innovation be affected? There are many parallels between government oversight of the economy and web standards… but we don’t have to go into that.

A fleet only goes as fast as its slowest ship. When Microsoft, Mozilla, and Apple vote on a particular feature to become a standard, some will have to pick up the pace while others will have to slow down. And that is what has been happening at the W3C and Ecma. Adobe has included a bunch of features that it would probably have had to remove in order to comply with ECMAScript 4. Now that ECMAScript 4 is dead, Adobe can keep those features… and even develop new ones without having to wait for Ecma to accept or reject revisions.

Of course, the downside is that we developers will have to work extra hard to make our products work on multiple software. Meaning more browser specific pages and more time researching what works on a particular browser and what doesn’t.

But here is where I disagree. At this level, the issue isn’t that someone with IE may see an awesome JavaScript menu while someone on FF may see an image-mapped navigation. The biggest misconception I have found is that coding standards are susceptible to the same criticism as software standards (browsers, GUIs, colors). But they ain’t.

Here’s why: the standards proposed in ECMAScript 4 concerned coding syntax rather than any product output. With today’s high-level languages, we can do basically anything we want with JavaScript, Flex, and ActionScript; however, the syntax is different. And THIS is something that NEEDS standardization. Just because the experience of writing source code may be better and more intuitive on one platform than another doesn’t change the actual output of the product. This is why I think the dumping of the ECMAScript 4 standard is a BAD thing. It was a scripting standard. So ActionScript may get private constructors now, whoop-dee-do.

What would have happened if RSS standards weren’t introduced? Sure, maybe one RSS format would have “extra features”, but we wouldn’t have the RSS readers that we do today. The Google Reader team can focus on making their product better and not on how to read hundreds of different XML formats.

I’m not saying that ECMAScript is comparable to XML; however, I am trying to address the criticism I have been reading that says web standards hinder innovation and are always a bad thing.

The reason ECMAScript 4 failed was that companies want to control the market. The excuse Microsoft and others use is that, in practical terms, the web doesn’t need certain coding features. But that’s just an excuse. Web applications are getting more and more advanced. A standard like ECMAScript 4, which would last for many years, needs to support the most advanced coding syntax possible. Sure, maybe most products today are not big enough to really benefit from abstract classes, but in 2 or 3 years, when ECMAScript 4 was scheduled to be adopted and shipped with products, who knows? And what about in 5 or 6 years?

In other words: for companies that have the capacity to advance their languages faster than others (like Adobe), and since AS4 may now come out sooner, this decision is ultimately good in the short term and bad in the long term.

AS3 EventManager Class: removeAllListeners

I heard about a class that Grant Skinner wrote called Janitor that was supposed to help keep track of listeners, but I couldn’t find it. Consequently, I wrote my own “EventManager” class for a gaming project I’m working on, which keeps track of event listeners in a project.

As all ActionScript 3 developers know, one of the biggest annoyances is keeping track of listeners and ensuring objects are garbage collected. This class does a lot of that for you, and even has a removeAllListeners method with different filters. Let’s say some listener keeps calling a function: you can remove all listeners that point to that function. Or let’s say you want to remove ALL key listeners.

Here it is:

*EDIT 9/30/08: EventManager Updated. Click here to get it.

For example, let’s say you have something like so:

var obj:MovieClip = new MovieClip();
var obj2:MovieClip = new MovieClip();
obj.addEventListener(Event.ENTER_FRAME, Test, false, 0, true);
obj = obj2;
obj.removeEventListener(Event.ENTER_FRAME, Test); // does NOTHING: obj now points to obj2, not the MovieClip that has the listener!

Test will still be called every frame, even though the weak reference flag is set to true and there are no more references to the original MovieClip! This is not good! But with EventManager…

var obj:MovieClip = new MovieClip();
var obj2:MovieClip = new MovieClip();
// the last parameter actually adds the listener; see the documentation in the class for why it exists
EventManager.addEventListener(obj, Event.ENTER_FRAME, Test, false, 0, true, true);
obj = obj2;
// Now we have a bunch of options; any of the lines below would work:
EventManager.removeAllListeners(null, Event.ENTER_FRAME); // removes all Event.ENTER_FRAME listeners
EventManager.removeAllListeners(null, Event.ENTER_FRAME, Test); // removes all Event.ENTER_FRAME listeners that call Test
EventManager.removeAllListeners(null, null, Test); // removes all listeners that call Test
EventManager.removeAllListeners(); // removes all listeners

The only problem with the class is that this version does not distinguish useCapture events… But big deal. Maybe I’m naive, but how often does anyone actually set useCapture to true?

Anyway, if this class gets a lot of attention I’ll put it as open source on Google Code. Then someone else can add in functionality for useCapture.

This class was such a pain to code. AS3’s Dictionary is the only surefire way to index objects by objects (Array uses the result of toString() for indexes), but because (for some reason that’s beyond me) Adobe decided not to give Dictionary a length property, my life was made very difficult. There might be some bugs, but based on my initial tests everything seems to be working fine. I wonder how Grant’s Janitor class compares to mine. I really couldn’t figure out any other way to write the function definitions for EventManager.addEventListener and EventManager.removeEventListener (having actuallyAddListener and actuallyRemoveListener as the last parameter). I wish AS3 had a way to get the memory location of an object, something like Object.toMemoryString. That way Arrays could be used, with the Object.toMemoryString value (unique for every object) as keys.
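
To give a flavor of the bookkeeping, here’s a stripped-down sketch of the idea (my illustration, much simpler than the actual EventManager; it ignores useCapture, priority, and weak-reference flags): a Dictionary keyed by dispatcher holds the (type, listener) pairs registered through the manager.

package {
    import flash.events.IEventDispatcher;
    import flash.utils.Dictionary;

    public class ListenerLog {
        // weak keys so the log itself never keeps a dispatcher alive
        private static var log:Dictionary = new Dictionary(true);

        public static function add(target:IEventDispatcher, type:String, listener:Function):void {
            target.addEventListener(type, listener);
            if (log[target] == null) log[target] = [];
            log[target].push({type: type, listener: listener});
        }

        // null arguments act as wildcards, like EventManager's filters
        public static function removeAll(type:String = null, listener:Function = null):void {
            for (var key:Object in log) {
                var kept:Array = [];
                for each (var entry:Object in log[key]) {
                    var match:Boolean = (type == null || entry.type == type)
                        && (listener == null || entry.listener == listener);
                    if (match) IEventDispatcher(key).removeEventListener(entry.type, entry.listener);
                    else kept.push(entry);
                }
                log[key] = kept;
            }
        }
    }
}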

If you decide to use this class, please let me know so I know my work hasn’t been in vain!

Jacobi Algorithm in AS3

Recently, my “Calculus for Computer Science” teacher assigned the following problem.

To complete this assignment, I decided to use ActionScript 3 and Flash. A friend of mine at Georgia Tech and I are developing a complex open source matrix library called as3matrix, and we were planning on implementing Jacobi to find the eigenvectors of an M×M matrix anyway, so we decided to just add it to our library.

The Jacobi algorithm is pretty straightforward. It’s impossible to use a formula to find the eigenvalues of a matrix larger than 4×4, because the characteristic polynomial has degree five or higher and there is no general algebraic solution for equations of that degree. So let’s say you have a 6×6 matrix. To find the eigenvectors, the Jacobi algorithm repeatedly picks a 2×2 submatrix inside that matrix, diagonalizes it, then applies the result back to the full matrix.

So what the program I turned in does is generate a random 5×5 matrix. The program then diagonalizes the matrix using the Jacobi algorithm with sorting, and then tries again using the Jacobi method while ignoring sorting when choosing the 2×2. Obviously, sorting converges much faster, since it guarantees the chosen 2×2 contains the off-diagonal entry with the largest absolute value in the matrix.

So anyway, here’s how the process works in our as3matrix library. The jacobi() method computes one iteration, while diagonalize() repeats the Jacobi method until Off (the sum of the squares of the off-diagonal elements) is less than 1e-10.

First, take the matrix A. Find the off-diagonal element i,j with the largest absolute value. Create a 2×2 matrix from the i,j elements, where a = i,i; b = i,j; c = j,i; d = j,j. This step was probably the hardest part because I kept mixing up the i’s and j’s! Quite annoying when you accidentally flip them…

Next, take that 2×2 matrix and diagonalize it. The formula for the eigenvalues that the library uses for 2×2 matrices is:

// eigenvalues of the 2x2 matrix [[a, b], [c, d]]
var L1:Number = ( (a+d)/2 ) + Math.sqrt( 4*b*c + ((a-d)*(a-d)) )/2;
var L2:Number = ( (a+d)/2 ) - Math.sqrt( 4*b*c + ((a-d)*(a-d)) )/2;

For the eigenvectors, I use a nice trick found by Harvard professor Oliver Knill. I then normalize the eigenvectors (something Oliver’s page fails to mention). Combining the eigenvectors into {u1, u2}, I now have my matrix U. I take that matrix and embed it into the identity matrix (of the same size as the original matrix); I call that matrix G. Then D = Transpose(G)*A*G.

Then, outside of the method, I check whether Off(D) < 1e-10. If so, I consider the matrix diagonalized!
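
For reference, here’s roughly what one iteration looks like in code. This is a simplified sketch rather than the as3matrix implementation: it computes the rotation angle directly with atan2 (which, for a symmetric matrix, is equivalent to diagonalizing the 2×2 block) instead of the explicit eigenvector construction described above, and it stores the matrix as an Array of Arrays of Numbers.

function jacobiStep(A:Array):void {
    var n:int = A.length;
    // 1) find the off-diagonal entry (i, j) with the largest absolute value (the "sorting" step)
    var i:int = 0, j:int = 1;
    for (var r:int = 0; r < n; r++)
        for (var c:int = 0; c < n; c++)
            if (r != c && Math.abs(A[r][c]) > Math.abs(A[i][j])) { i = r; j = c; }

    // 2) rotation angle that zeroes A[i][j] for a symmetric matrix
    var theta:Number = 0.5 * Math.atan2(2 * A[i][j], A[i][i] - A[j][j]);
    var cs:Number = Math.cos(theta);
    var sn:Number = Math.sin(theta);

    // 3) form G^T * A * G, where G is the identity with the rotation embedded at (i, j)
    for (var k:int = 0; k < n; k++) { // rotate columns i and j  (A <- A * G)
        var aki:Number = A[k][i], akj:Number = A[k][j];
        A[k][i] = cs * aki + sn * akj;
        A[k][j] = -sn * aki + cs * akj;
    }
    for (k = 0; k < n; k++) { // rotate rows i and j  (A <- G^T * A)
        var aik:Number = A[i][k], ajk:Number = A[j][k];
        A[i][k] = cs * aik + sn * ajk;
        A[j][k] = -sn * aik + cs * ajk;
    }
}

// diagonalize() just repeats this until Off(A), the sum of squared
// off-diagonal entries, drops below 1e-10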

Here are the results of Jacobi (with sorting) vs. the theoretical bound and Jacobi (without sorting) vs. the theoretical bound. Since AS3’s Math.log is the natural logarithm and there’s no log-base-2 function, I just used the change of base formula (log(x)/log(2)). I hard-coded log(2) to optimize the code.
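
In code that’s just (the constant name here is my own):

const INV_LOG2:Number = 1 / Math.log(2); // precomputed so each call avoids a second Math.log

function log2(x:Number):Number {
    return Math.log(x) * INV_LOG2; // change of base: log2(x) = ln(x) / ln(2)
}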

A couple of random 5×5 matrix samples:

After running 100 random 5×5 symmetric matrices through the Jacobi algorithms, these were the average numbers of iterations for each:

Average Sorting = 25.11
Average no Sorting = 102.94

Sorting is clearly the best method.

Anyway, you can browse/download the as3matrix library here. Check out the TestJacobi.as in the trunk.