Tuesday, December 21, 2010

More Homage/IP Theft

Although I already posted one example of cultural homage (or IP theft) in Fallout: New Vegas, I found another common one, so I thought I'd share it: the famous holy hand grenades (count ye to three...) from Monty Python and the Holy Grail.


Some cultural items (or I could say cultural texts) are canonical to certain cultures, and knowledge of them serves as a cultural identifier. In the previous post it was The Princess Bride; here it is Monty Python. References to Monty Python can be found in many geeky places: for instance, there is fleshwound armor, again a reference to Monty Python and the Holy Grail, in both World of Warcraft and EverQuest II.

So why do we do this kind of thing? Because we, as a species, love to play, and we love to play with the things we love. It's what we do. Belonging to a group is also very important; historically speaking, we don't tend to survive on our own. Play is often done with others and aids group formation, and showing knowledge of in-group references like "holy hand grenades" signals group membership. Humans have been called hyper-social beings, and now we have all of this social media, and here I am using some of it to show the sociality of some other people using a Monty Python reference. Of course, I am also showing my own membership in this geeky crowd, since I am playing Fallout: New Vegas and I knew what the holy hand grenades were.

However, I have reached a bit of an impasse with Fallout: New Vegas. One of the quests is glitched and you can't get past it; it effectively stops your game right then and there, with no way around it unless you are on the PC. (I can maybe work past it by going back to my last save and ditching the companion who causes it.) I am not, however, the only person to feel that there are too many bugs in the game. One thing that bugs me is the large number of floating rocks and corners of houses where light comes through. There is even one rock formation that is not "closed off" on one side, so you can walk into it (to the far SE by the miner's camp near the lake, a bit to the left of the camp on that slope).

Friday, December 17, 2010

When There Is No Digital Sediment

I may have been thinking about the ancient peoples of Europe, or even Neanderthals, at the time, but I realized one thing some virtual worlds lack today is the archaeological record found in the layers of sediment put down by time. There is no digital sediment there.


This was not always so, I believe, particularly with MUDs. In the non-virtual-world space, with Google and the Wayback Machine, there is plenty of digital sediment; Wikipedia is all digital sediment. In MMOs, I don't think there is any, although I don't know every MMO in this manner -- in EverQuest II, if you stop paying for your house or guild hall, it's off-limits. Any "history" was put down by the game designers and isn't the same as the history of the digitally lived experience, even when that "history" refers to the original EverQuest.

In Second Life, you have to buy space. I am not entirely sure what happens with space that is given up -- I'm under the impression it gets sold, either to another user or a land broker, but either way the digital objects are usually wiped and something new is put in, with no layering on the past, just an erasing. Although this makes digital building easier -- and I am a bit hesitant to call it "building" since it has very little to do with building in the physical sense -- it certainly does away with one of the usual conditions of the real world. I imagine if the owner of a zone doesn't want it, and can't sell it, it gets erased. Eventually Second Life will have to shrink, but I don't follow Second Life currently (it's creepy), and I doubt the Lindens will trumpet the loss of "land" like they trumpeted when all the corporations and embassies moved in (when the corporations moved out, it wasn't the Lindens talking about it).

MUDs, since many of them were non-commercial and run on some university server somewhere, just built up stuff over time. There weren't exactly layers, but there were additions and additions and more additions.

Some games use history as a feature. Dwarf Fortress is one; once you dig away the rock in the mountain, it's gone. Sure you can essentially fill it back in, but not with original stone, you have to make floors and walls instead. The history of the space is often there. SimCity, from what I recall, had some features that were easy to undo (like zoning) but other ones that weren't. Overall, though, you could re-build anything in-game, erasing everything before it.

But if we want to deal with virtual worlds as real-ish worlds, or as real worlds but in digital form, we need to remember what qualities digital worlds don't have. History is vitally important to us, but often the digital sediment of the past is missing. It's not presented as a bug, it's presented as a feature.

Friday, December 10, 2010

The Memex Contains and Is You

I am sure someone has pointed this out before, but I was looking at a blog post and there was the now-common "Like this on Facebook" link, and it struck me that we are the connected document.


It's like what they did in Caprica, which I thought was a good show. They portrayed the initial virtual world as full of porn and violence, and they built online independent avatars based on the giant database of information about any particular person (plus behavioral algorithms).

We can see the beginning of this distributed database: there is the more social side of it (Facebook, etc.), and the more commercial side of it (how that information is used to select advertisements to place on a page you visit). It is a web of connections and data, with you at the center of it. Your online self is represented by this data, but in a stronger way than just being a record of your actions online. With enough connections and enough data, it is you.

Reading Links (Mostly Wikileaks)

David Pogue writes about Corning's Gorilla Glass, very cool. (The odd man out in this listing.)


2600 objects to Anonymous' DDoS'ing in the whole Wikileaks thing.

Glenn Greenwald has some good and accurate coverage of the Wikileaks madness.
The New York Times has a nice piece on European reaction (bemusement and surprise) to the American fuss over Wikileaks.

Honestly I haven't seen any surprises at all from the relatively few leaked documents. It's not like the documents have outed the identity of any undercover CIA agents, that would be the Bush administration who did that.

Saturday, December 4, 2010

Play and Homage (and IP Theft)

People like to play with the things they like. If we play with snippets of music, we are called pirates by the music industry. If we play with other cultural items, like TV and comic book characters, and if we do it in spaces like Spore or LittleBigPlanet, it's seen as acceptable (although I am sure it is a bit more complex than that, I don't have any insider info from EA or Sony about it).


What is interesting is when branches of the big intellectual property companies, like Sony (with Sony Music, and Sony's movie and television production), have spaces where intellectual property is borrowed by the company itself (although since companies are not people, obviously this is done by the people at the company). So Sony's EverQuest II has lots of cultural homages (or theft) in it, placed there by the people who work for Sony Online Entertainment. One "homage" is the epic (or something) questline for the swashbuckler character class, which is loosely but obviously based on the movie The Princess Bride. Swashies are pirates, right? And pirates are an important part of the film.

(Hmm actually I should, to be thorough, see if that's a Sony film -- but there are lots of examples in EQII that aren't owned by Sony, so even if this is not a good example the point still holds.)

OK, Wikipedia says it is not a Sony-owned piece of IP; it's Fox and Lions Gate and MGM. We are still on solid ground with this example, good.

And this is found across companies and virtual spaces, so upon some reflection I was pleased but not too surprised to discover another Princess Bride reference in Bethesda's recent game Fallout: New Vegas (developed by Obsidian). In one cave you will discover big cave rats, but they are not just big rats in a cave: they are rodents of unusual size. If you know The Princess Bride, you know there are rodents of unusual size in the Fire Swamp. It is possible this was done with permission, but I really doubt it. (That would also put a big dent in this argument.)

But, the whole point of this post is that I took a photo of it (since I play it on my Xbox 360, not on a PC where I could take screen shots and have mods, which would be cool).

(The "S" down at the bottom of the photo is light hitting the S in Sharp, the brand of television I have currently.)

Names and History

A nice point by Paul Graham about what we call touchscreen devices like iPhones, which I thought I saw on boingboing (but I can't find it, so maybe not) and on Daring Fireball:

The only reason we even consider calling them “mobile devices” is that the iPhone preceded the iPad. If the iPad had come first, we wouldn’t think of the iPhone as a phone; we’d think of it as a tablet small enough to hold up to your ear.
This is standard human behavior; we've done it before. We present new things in terms of the more familiar, then-current things. Horses led to horseless carriages, from which we dropped the "horseless" and the "-riage" part of "carriage" to get just "car" (I believe), which we still drive. There was also the iron horse (the locomotive), and the wireless telegraph (early radio, before voice was used).

Wednesday, December 1, 2010

Call of Your Sofa

I was passing through Times Square the other day (sadly; it's a terrible place, and Red Lobster should really be illegal, not just because it's fast food but because of what they do to seafood), and I was amused by this Call of Duty billboard.


The current ad campaign has lots of "normal" people, and a few television-based people (Jimmy Kimmel and Kobe Bryant, I think), running around with guns shooting at a ruined building. The people on this billboard are supposed to be normal people (they are either actors or models though). They are each holding two pistols of some sort, and have two rifles on their backs (I will assume one is an M16, and, given the game, I hope the other is a sniper rifle).

I like most of the Call of Duty games. But this ad campaign is ridiculous.

When I play CoD, and when most people do, we are sitting on our sofas. We are holding a very lightweight video game controller, just one, in both hands. We do not move. We do not run. We do not sweat (the people on the billboard, IIRC, are all sweaty from running around with heavy guns).

The people in the billboard are not holding one lightweight controller. The M16, according to Wikipedia, weighs about 8 pounds, so let's say that's 16 pounds on their backs (not too much). Well, they could be M4 carbines, which weigh a bit less. As for the pistols, Wikipedia gives the weight of the M9 in ounces and grams: about 1 kg, so let's say 2 pounds (not much). Of course there is also the recoil when the weapon fires, which isn't anything like the vibration of an Xbox 360 controller.

The Xbox 360 controller weighs about 250 g, one-quarter of an M9 pistol, and since the one controller is held in both hands, that's roughly 125 g per hand, compared to 1 kg per hand for the dual M9s in the billboard.
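Since this is all just arithmetic, here it is worked out (a quick back-of-the-envelope sketch; the numbers are the rounded figures above, and ammo and grenades are omitted):

    # Back-of-the-envelope: billboard loadout vs. sofa loadout.
    # Rough figures from above; ammo and grenades omitted.
    LB_PER_KG = 2.2046

    rifle_lb = 8.0        # M16, per Wikipedia
    pistol_kg = 1.0       # M9, roughly
    controller_kg = 0.25  # Xbox 360 controller

    billboard_lb = 2 * rifle_lb + 2 * pistol_kg * LB_PER_KG  # two rifles, two pistols
    sofa_lb = controller_kg * LB_PER_KG                      # one controller, both hands

    print(f"billboard: {billboard_lb:.1f} lb")           # about 20.4 lb
    print(f"sofa:      {sofa_lb:.1f} lb")                # about 0.6 lb
    print(f"ratio:     {billboard_lb / sofa_lb:.0f}x")   # about 37x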

The whole thing is pretty ridiculous. Reminds me of the NFL's "Play 60" campaign -- you should get exercise, just not when games are on, then we need you to sit passively in front of the television so we can sell your viewership to advertisers. (Remember, football is not the product, you are the product, and your attention is sold to advertisers.)

Edit: It has been pointed out to me that I forgot the weight of the ammunition and grenades. This could be... well, I don't know, but more than nothing. I have no experience carrying around a bunch of ammo or grenades. They are all metal, though.

Monday, November 22, 2010

Microsoft's Kinect

Penny Arcade has had quite a bit of writing about the Kinect; they don't particularly like it, but others are fascinated by it. Whatever their take, I am profoundly disturbed by Jenna Wortham's writeup in the New York Times (indeed where one tends to find her writeups), especially the sentence that says hackers are "getting the Kinect to do things it was not really meant to do," because this is not at all true (except for the "not really..." part, which any good Wikicultist would flag as "weasel words," and they would actually be correct).

The Kinect was not designed to be a motion-sensing device that is inherently and only part of the Xbox 360; if it were, it would have been built in. It is not. It is a motion-sensing device that you can connect to anything with the right connector, with Microsoft hoping that would be the Xbox 360. And if you know anything about people, you know we like to play with things, especially things we like.
“Anytime there is engagement and excitement around our technology, we see that as a good thing,” said Craig Davidson, senior director for Xbox Live at Microsoft. “It’s naïve to think that any new technology that comes out won’t have a group that tinkers with it.”
Except of course Microsoft, or the people at it, were extremely naïve, because earlier...
A [Microsoft] representative said that it did not “condone the modification of its products” and that it would “work closely with law enforcement and product safety groups to keep Kinect tamper-resistant.”
Microsoft's model has typically been one of control. Control over Windows, control over the Xbox, control over Microsoft Office, and so on. It was Sony that made it easy to load Linux onto their PS3, not Microsoft and its Xbox 360, although Sony later took away this capability (I am not sure of the politics behind that one, it may be interesting). Note that hackers have adapted Linux for both platforms regardless.

But we've seen so many instances where people do like to play with things (it's a part of who we are). For instance, Bethesda's line of games, such as Oblivion, which is available for both the Xbox 360 and Windows. There are no mods for games or anything on the Xbox; it's not part of the business model. (Mods are made by players, as opposed to patches and DLC, which come from the company.) On the PC, however, there is a thriving mod scene (which I have written about). Bethesda supports the modders, gives them forum space, and interviews them (here is one example, and you can check out their posts tagged "modding"). The people at Bethesda know we like to play games and play with games, and we will do so whether they want us to or not. Mods can, and do, fix bugs and add new maps, zones, characters, quests, and everything: for the game producer, your customers can be developers who make the game better, for free. It's not just win-win, it's win-win-win (producer, modder, players).

Here's a recent Ten Best Oblivion Mods list from PC Gamer. Keep in mind Oblivion is over four years old already. In part because it's a great game, but in part because of the mod scene, people are still playing it.

I'm not sure exactly how old the modding scene is: the Internet itself is essentially a giant mod, so, 40 years. It depends on your definition. The Flight Sim mod scene is pretty old, dating back to at least 1990. That's 20 years (and Flight Sim is now, or was for a long time, a Microsoft product!). One would think that everyone would have noticed this long-standing given (I resist the word "trend" there; this is not a trend, it is a constant).

Wednesday, November 10, 2010

Design Options

I want to talk about two designed items, the choices behind them, and the resulting ease of use: an alarm clock and a recycling can.

I have a Brookstone alarm clock with a long-life battery, so it will always (or, for longer than the rest of it will last) remember the time, like magic (that's the idea). As a user feature, they built the daylight saving time change into it, which is nice, since I never have to change the time; it does it automatically, like my phone and my computer. Except I do have to do it, four times a year, since the US Congress changed when we change the time.

The problem is that the time-change dates are hard-coded and not at all flexible; the information that is hard-coded into it (what date the time change happens) did indeed flex, but the device can't. So, zero was better than two (zero changes if the clock changes the time itself, two if I change the time). But now it's four, and if zero is better than two, then we know four is pretty terrible.


(To be clear, it's four because it changes earlier than it now should, so I have to change it back, then it doesn't change when it's supposed to, so I have to change it again, and do this the two times a year we change the clocks. And this only works if I live in part of the US where we change the time.)

Automatic time change? Good usability decision. Hard-wired? Not good.
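For comparison, this is roughly how software clocks avoid the problem: they look the rule up at runtime instead of baking the dates in. A minimal sketch in Python (assuming a system time zone database, which my clock obviously doesn't have; the point is the design, not the code):

    # A minimal sketch of the flexible approach: ask the system's tz database
    # for the rule instead of burning dates into the firmware. The tz database
    # gets updated when Congress changes the rules; a hard-coded clock doesn't.
    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+

    def is_dst(when: datetime, tz: str = "America/New_York") -> bool:
        """True if daylight saving time is in effect at `when` in `tz`."""
        aware = when.replace(tzinfo=ZoneInfo(tz))
        return aware.dst().total_seconds() != 0

    print(is_dst(datetime(2010, 7, 1)))   # True: summer, DST in effect
    print(is_dst(datetime(2010, 12, 1)))  # False: winter, standard time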

The other item, a recycling can, I saw the other day at one of the artsy theaters on Houston Street here in NYC. One problem with a lot of public-area recycling cans is that people are busy, their attention is elsewhere, and they throw trash into the recycling cans, which makes them look like trash cans, so more people do it. But this design had a little lid with the recycling logo on it, and you had to lift the lid, which meant you had to look at it and think, "How do I open this? Aha, a handle. Oh look, a recycling logo." It forced you to slow down a second and think, but only a very small, easy amount.

Granted these are two different areas of design, but they both remind us of the importance of design, and how little things can make a big difference. Also, flexible systems.

Friday, October 29, 2010

Acronyms and MMOs

Back in the 1980s and 1990s, we had games like World of Warcraft: multiplayer, Dungeons and Dragons-based, online worlds. Except they were text. They were called MUDs or MOOs, MUD for Multi-User Dungeon and MOO for MUD Object Oriented (a comment about the programming behind it).


We also had Dungeons and Dragons and a whole slew of other role-playing games, called RPGs.

Notice these all adhere to the three-letter acronym standard, which also has a three-letter acronym, TLA.

Eventually graphics and bandwidth improved, and we had graphics-based versions of MUDs. People started calling them MMORPGs, which was just stupid. No one called them MUDRPGs or MOORPGs. MMO would have not only fit the TLA standard, it would have thematically matched MUD and MOO, besides being visually similar to MOO (MMO, MOO, although they are pronounced "em-em-oh" and "moo" like a cow, respectively, and MUD is "mud" like dirt).

Currently there are some people who use MMO, thankfully, yet there are, rather oddly, others who refer to MMOs as MMOGs. This is strange for three reasons:
  1. It does not conform to the previous acronym method for these objects (MUD, MOO).
  2. It does not conform to the TLA standard.
  3. There is no other "MMO" so it's not as if MMOG is a clarification, the G is extraneous.
You could argue that MMOs are indeed Massively Multiplayer Online Games, and so you have to have the G, but given that an acronym drops so many letters already, why not one more? We already lost the Role-Playing part. MMO is still just as clear, and as an acronym it is 25% shorter than MMOG. (Overall, MMOG drops 27 letters and MMO drops 28, which is not a significant difference in the loss of original information.)
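If you want to check the letter math, it's a two-liner (counting letters only, spaces excluded, against the plural expansion):

    # Checking the letter count against the plural expansion used above.
    expansion = "Massively Multiplayer Online Games"
    letters = sum(c.isalpha() for c in expansion)  # 31 letters

    for acronym in ("MMOG", "MMO"):
        print(acronym, "drops", letters - len(acronym), "letters")
    # MMOG drops 27 letters
    # MMO drops 28 letters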

Monday, October 25, 2010

Apple, Handhelds, and Disruption in Markets

Every time I see an advertisement for a touch phone, smart phone, or whatever you call them, I can only think of how Apple created this working technological system and the market. Really they are touch-screen handheld computers, but they are marketed like phones, so we call them phones and think of them like phones -- computers had modems built in at one point, and we could actually make phone calls on them, but no one ever referred to a computer as a phone.


I don't want to say "the market for these devices wouldn't have existed without Apple," since that's not true; the market did exist before Apple made the iPhone, and that's why the iPhone immediately sold so well. However, it is true that these smart phones, or touch-screen handhelds (TSH is not a good acronym, though), would not exist today without some company having made the move first and gotten a positive market response -- and in this case, such an overwhelmingly positive response that every other phone player wants in on the game.

And that's why we have to think about disruptive technologies.

I am not, by far, the first to point this out. The radio powers of some past decade sat on the newer, better FM technology, since their entire empire was built on AM radio: their manufacturing process didn't make radios that could tune to FM signals, and none of the radios people owned could tune to FM either. Eventually they rolled it out. I have read how companies fear new technologies: change is risky, and people fear losing their jobs if the company shifts to something new.

Thus, Apple. Not a player in the phone market. No one at Apple stood to lose much if the iPhone wasn't a hit. But they also had experience with the iPod, the iPod as part of the larger iTunes system, and they could learn from their earlier Newton, from Palm, and from RIM's BlackBerry. And given that the iPhone is really a computer, Apple had experience in that market.

None of the established phone makers would have made an iPhone knock-off, like they all do now; it was simply too risky. They didn't have the experience, they were locked into thinking about cell phones and not touch-screen handhelds, and they didn't have an existing infrastructure into which they could connect the device (iTunes). It is possible that individuals or teams at the more innovative cell phone companies tried to push an iPhone-like idea, but it wasn't until Apple blew the cell market open that anyone else made one. And of course it isn't actually the cell market; it's something new that people took a while to figure out, just like the iPad.

Saturday, October 23, 2010

Sci-Fi and Victorian Grammar Rules

I've seen commentary somewhere about split infinitives and Star Trek ("To boldly go!"), but it only recently clicked with Star Wars and "These are not the droids for which you are looking," with its preposition. I've written about Victorian grammar rules before (and according to Google I am pretty much the only person who refers to them as that, which is... odd... given that I didn't make up the phrase) and about a piece by David Foster Wallace on such things.


Now, if anyone could pull off using formal written English while speaking, Alec Guinness was one of those people, but it still would have been jarring.

For the Star Trek intro quote, splitting the infinitive makes for better cadence/rhythm/meter.

If we look at sci-fi as being about the possible, well, there you have it.

Sunday, October 17, 2010

Facebook's Insane Application Allowances


I fired up Apple's iPhoto Uploader for Facebook, and since I hadn't done so before, I had to connect the app to my Facebook account. To do so, you have to give the app permission to do a variety of things (see photo), maybe. Although presented here in a nice visual list, this seems to be the standard list of things you have to grant every app on Facebook when you want to use it (so I use practically no apps).


The list is insane.

iPhoto Uploader doesn't need to do any of those things. Does this mean it is going to? Or it might? Or is it just standard boilerplate, and it will only upload photos and nothing more? Standard boilerplate would make the job of the people at Facebook a lot easier -- "Just use this text, the lawyers approved it, no app does all that, most do very little." But it's not at all clear what iPhoto Uploader will do, and I just agreed that it could do all of those things.

I don't want it to, and it doesn't need to, post to my wall, access any of my information, or access relationships, and it certainly doesn't need to access my friends' information. No way, no how. And really it shouldn't; that would be a ton of info that I don't think Apple needs, although in this age of data mining, who knows. All it needs is access to my Facebook photo space to upload photos.

Edit: Oh look at that. "New Facebook privacy breach involves apps leaking user data," at boingboing.

Friday, October 1, 2010

Community and Dwarf Fortress

I haven't posted much lately; besides working on the book, I've been playing (er, studying) the game Dwarf Fortress (available for Windows, OS X, and Linux) and the culture of its fans. It is a difficult game for at least two reasons: one, it is a difficult game (you have to micromanage a ton of stuff, more as you grow your fortress), and two, the interface is brutal. And there is no winning the game. There is no win condition. There is only eventual death for your Dwarves.


The game, without an added tileset, is all in ASCII. No, not text: ASCII. So yes, it looks like text, but it's not words. The 2D world is presented in ASCII (but it's a 3D world; you scroll up and down through levels). That E over there? An elephant. T? That's a troll. O? A giant olm. Lowercase c? A small cat, a.k.a. a kitten. But not everything is letters. I use a tileset, since I found the original flavor of DF absolutely impossible, especially when combined with the difficulty of the game for newbies. (So actually those letters are the ones I get in the tileset; they may or may not be the ones in the un-tilesetted game interface.) Here are some screenshots at Bay 12.
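To make the display convention concrete, here is a toy sketch of the idea (my illustration, not DF's actual internals): the map is a grid of glyphs, and a tileset is essentially a remapping of those glyphs to images.

    # Toy sketch of an ASCII display layer: one glyph per creature. A tileset
    # is basically this same mapping, but pointed at images instead of letters.
    GLYPHS = {
        "E": "elephant",
        "T": "troll",
        "O": "giant olm",
        "c": "kitten",  # the lowercase letters I see tend to be smaller creatures
    }

    def describe(cell: str) -> str:
        return GLYPHS.get(cell, "unknown; check the wiki")

    print(describe("E"))  # elephant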

Try reading this review, which says, "nothing about it is simple. Dwarf Fortress is immense, hugely complicated, insanely detailed, and uses only ASCII characters for graphics." And more!

But this does not stop the fanbase. In fact, given that people often buy into something more strongly, emotionally, if they had to put more effort into it, we can see why this might be so (and is for some, if you can get over the learning curve, which is really more like a learning cliff, it is so steep). Here is a simplified flowchart of the game. Keep in mind some DF fan made that flowchart.

But a lot of the things that people do with other stuff are the same things people are doing with Dwarf Fortress.

Fans made a wiki, and have updated it across major version changes. (You will want it open whenever you play.)

Fans made tutorials, since the game is impossible without them. (Here's one, and here is the wiki page with several.)

Fans have modded the game with tilesets (again, impossible without, except for the real die-hards). Here's one that is pretty amazing. Here's a page with links to mods and tilesets.

Fans have made DF art and stories based on gameplay. (Well worth a look, great art style, and funny, note the grim humor.)

Fans start up a new game, and then hand it off to someone else, to let them run the fortress, and they write it up. (This is for an older version of DF, which I think had just one level. Note the grim humor.) (To use and paraphrase Sony's and MM's LittleBigPlanet tagline, which I use all the time, they played, they created, they shared.)

There are of course forums, where players help out by answering other players' questions (so I learned that flux needs to be on the same level as your smelter and forge or it won't be available, it's bugged and was driving me crazy).

It's fun. (I didn't link that for no good reason, by the way. Note the grim humor.)

But the fun (as defined above in that link) is pretty amazing, and relates to the absolutely grim sense of humor that most DF players exhibit when writing about DF online.

Here's a line from a current (may change!) wiki entry: "It is unknown whether this is a bug or a feature."

From the Bronzemurder saga that you should have read (really): "I play Dwarf Fortress. Sometimes I wish I was a meth addict instead."

There seem to be a lot of stories of accidental floods. Elephants. Elves. Forgotten beasts. Dwarves falling down wells (apparently they do that). Dying of thirst during winter if you have no water source (oops!). On occasion they kill each other. "Things of that nature" where "things of that nature" include pretty much anything.

There are many other examples that I've seen, and that you can find online. I won't say "I will try to post them" since if I never do then it will be one of those never-corrected online sentences, posted and forgotten.

Edit: A Master's thesis on Dwarf Fortress? Why yes, the MIT Comparative Media people have been there and done that.

Friday, September 10, 2010

Updated Google H Score: More Begets More

Here is a one-year later followup to my post about my Google H score. As you can see, the papers that were cited more often a year ago were cited even more often, in a mostly exponential manner. More cites meant even more cites. I think the general mechanism is that, as a paper is cited more often, it will show up in literature (as a cite) that people read about a given topic, so people are more likely to cite articles they see widely cited (assuming the article is relevant, and hopefully people won't cite an article unless it's good, so to some extent "widely cited" relates to quality). Higher-cited articles are generally listed higher in Google Scholar searches as well. Newer articles about hotter topics will break this pattern (apparently I am not writing recently about hot topics!), but I think this is one factor. Of course if more cites means higher quality, this could just be an issue of quality. I have the feeling that source journal also plays a part, but that's for a variety of diffuse reasons.
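The mechanism I am describing is a rich-get-richer process, and it is easy to sketch (a toy simulation of my own, not a model anyone has fit to these numbers):

    # Rich-get-richer sketch: each new cite goes to a paper with probability
    # proportional to its current cites, plus a small base weight so papers
    # with zero cites still have a chance. Starting counts are the 2009
    # values from the table below.
    import random

    papers = {"Mechanisms": 25, "Broadband": 9, "Honey": 8, "Curveball": 7}

    for _ in range(30):  # hand out thirty new citations
        total = sum(papers.values()) + len(papers)  # +1 base weight per paper
        r = random.uniform(0, total)
        for name, cites in papers.items():
            r -= cites + 1
            if r <= 0:
                papers[name] += 1
                break

    print(papers)  # the already-well-cited paper soaks up most of the new cites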


Notice that my Google H score is still 4. Bummer! I need 3 more cites on the "A cross-national study..." paper to get my Google H score to 5.

Article (short title)                   | Journal     | Author(s) | Year | 2009 | 2010 | Incr.
Mechanisms of an online public sphere   | JCMC        | Solo      | 2005 | 25*  | 42*  | 17
To broadband or not to broadband        | JoBEM       | Co        | 2004 | 9    | 10   | 1
Honey, I shrunk the world!              | MCS         | Co        | 2006 | 8    | 12   | 4
Playing Internet curveball...           | Convergence | Solo      | 2006 | 7    | 11   | 4
A cross-national study...               | TIS         | Solo      | 2007 | 1    | 2    | 1
Copyright notices...                    | JCMC        | Solo      | 2008 | 1*   | 1*   | 0
Global citation patterns...             | IJoC        | Solo      | 2009 | 0    | 0    | 0
Stratification and global elite theory  | IJoPOR      | Co        | 2009 | 0    | 0    | 0

Values as of Sept 10th, 2009 and 2010.
* indicates one self-cite, relevant, honestly!
Neither self-cite affects the Google H value.
The numbers fluctuate from time to time, up and down.

Wednesday, September 1, 2010

Touch is not a Natural Interface

There is a rather dismal article over at the New York Times, "To Win Over Users, Gadgets Have to Be Touchable," by Claire Cain Miller. (I think my expectations are too high for the NYT, but that's another story.) The main idea of the story is that the current touch interfaces (thank you, Apple) are "natural" and we don't need to learn them; we know them already.

Unlike past interfaces centered on the keyboard and mouse, natural user interface uses ingrained human movements that do not have to be learned.
But this is not at all true. A lot of the things we do with touchscreens are the exact same things, conceptually, that we've been doing with GUI interfaces since 1984 (GUI, for those of you who have forgotten, stands for Graphical User Interface, meaning a mouse, windows, file icons, folders, probably a desktop -- in other words, not the text-only CLI, the Command Line Interface).

If you want an app to launch on your iPhone, you touch it.

If you want an app to launch on your Mac, you touch it with your mouse, which is your onscreen finger. Technically you mouse over it and double-click it, but it's the same concept. Your mouse cursor is always on the screen; your finger spends most of its time not physically touching your touch-device screen. Same thing.

I could go on, but let's look at text. If you want to select text on your iPhone, you hold your finger on it and expand or select the amount of text you want. If you want to select text on your Mac, you have to take similar direct action on the text with the mouse cursor, which may involve clicking (perhaps with the shift key) or click-dragging or double-clicking.

It is, again, the same basic concept.
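If it helps, here is that equivalence as a toy sketch (the function and names are mine, not any real toolkit's API): both input devices reduce to the same abstract "activate" event, and only the source differs.

    # Toy sketch of the argument: tap and double-click are surface forms of
    # the same abstract event. Names are mine, not from any real UI toolkit.
    def launch_app(target: str) -> None:
        print(f"launching {target}")

    HANDLERS = {"app_icon": launch_app}

    def on_activate(element: str, source: str) -> None:
        # source is "mouse" (double-click) or "finger" (tap); same action either way
        HANDLERS[element](element)

    on_activate("app_icon", source="mouse")   # the 1984 way
    on_activate("app_icon", source="finger")  # the 2010 way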

In some ways, it is not "ingrained." Widespread literacy is a historically recent phenomenon, and in any case I doubt the ancient Romans poked their fingers at text on a scroll and expected anything to happen.

In other ways, it is ingrained, because by copying the concepts from desktop, GUI-based, operating systems, touch-screens were copying a computer front-end that tried to mimic our human interface with our real-world offices: files, folders, trash cans, and a desk (desktop). If you want to do an action on an item, you poke it with your mouse cursor.

I've seen many a touch device running Windows, where the mouse cursor follows your finger. I cannot remember what these devices are; they may be airport check-in kiosks. (I am pretty sure some were.) Your finger is the mouse.

Of course if you have an iPad or iPhone and there are no buttons on it, well, what's left to try? Touch.

I am reminded of a scene in one of the Star Trek films (the one with the whales, and Chekov's famous line about "nuclear wessels"). Scotty sits down in front of a classic-form-factor Mac and tries to talk to it like he would his futuristic computer. The present-day assistant looks confused, and then hands Scotty the mouse, saying "Try this." Scotty picks it up and tries to use it like a microphone. (Eventually he gives up and uses the keyboard, which isn't a realistic approach to what he ends up doing, but that's beside the point.)

Natural interfaces are learned. (We spend many years learning how to interact and interface with the real world and especially people.) They may be easy, but they are learned.

Tuesday, August 31, 2010

Connect: Online Avatar Dancing

I've just started Musicophilia, by Dr. Oliver Sacks (author of several great books I have read), and so I want to present a first-draft section I have on dancing: specifically, why so many people are all about dancing avatars and videos of dancing avatars, and why we like dancing in general.

The footnotes are a bit unclear, but at least Scrivener did the copy (of copy and paste fame) well and automatically included them at the end, in a nice list format. I'll add some title information.

Dancing (draft), from Connect.

Dancing is a big thing on the Internet, especially YouTube, where you can find real dancing (like the music videos MTV used to show), and seemingly pointless but occasionally funny videos of avatars dancing in MMOs like World of Warcraft, EverQuest, and There.com. People like dancing so much that MMO companies have built dance moves into the capabilities for avatars. It certainly doesn’t help your level 7 elf battle orcs, but that’s what people wanted.

Not everyone is into online dancing, as the Reuters in-world employee, Eric Krangel, found out. “As part of walking my ‘beat’, I’d get invited by sources to virtual nightclubs, where I’d right-click the dance floor to send my avatar gyrating as I sat at home at my computer. It was about as fun as watching paint dry.”[1] The problem is Krangel wasn’t there for the dancing and the music and the text-chatting. He wasn’t a part of the community that likes that kind of activity. He was there as a Reuters employee to sell the Reuters brand. He could have gotten into it, but that he didn’t isn’t a big surprise.

Dance is a form of ritual, a concept that receives attention from researchers, and as a form of shared and coordinated play it can lead to community bonds. It is a very old human behavior. As psychology researcher Fitch observed, “music and dance are found in all cultures, and have been for many thousands of years.”[2] Lee, writing about the history of ballet, pointed out that, “Throughout the ages, a wealth of documentation in the form of cave paintings, Egyptian hieroglyphics, description of ancient Olympic games, and Old Testament references have attested to the importance of dancing in society.”[3] Garfinkel, writing about early human dancing, cited evidence for dance in the Middle East and Europe as far back as the 8th millennium BC.[4]

Although we have a lot in common with our fellow mammals and primates, McNeill observed that “community dancing occurs only among humans.”[5] In further contrast to other animals who have behaviors that we refer to as dancing (like bees), humans dance in groups in a synchronized manner to music, which other animals don’t have, and music is “fundamental and central in every culture” writes Dr. Oliver Sacks in his book on music and the human brain.[6] In fact, dancing and music are tightly related in our brains. As Berkeley professor Walter Freeman explained, “music together with dance have co-evolved biologically and culturally to serve as a technology of social bonding.”[7] A shared ritual that fosters community, the two are “the biotechnology of group formation.”[8] The current English word play is related to an older and similar Old English word, but, according to the Oxford American Dictionary, it is also related to the Middle Dutch word pleien, which, perhaps not surprisingly, can mean dance.[9]

Dance is a form of communal play, and is clearly an important part of who we are. Knowing this, we should not be surprised to find it online in some situations where it seems to have no point for the virtual world, as indeed we do.

[1] Neate (2009). (In The Telegraph.co.uk)
[2] Fitch (2006), p. 199. ("The biology and evolution of music", in Cognition, v. 100)
[3] Lee (2002), p. 1. (Ballet in Western culture)
[4] Garfinkel (2003), p. 106. (Dancing at the dawn of agriculture)
[5] McNeill (1995), p. 13. (Keeping together in time)
[6] Sacks (2007), p. xi. (Musicophilia)
[7] Freeman (2000), p. 411. (In The origins of music, by Wallin, Merker, and Brown)
[8] Freeman (2000), p. 417.
[9] See also Huizinga (1955), p. 31, for more on play and dance. (Homo ludens)

Thursday, August 26, 2010

Recent Review Scores

Speaking of reviews, here are the scores from the ten reviews that Dr. Skoric and I received on a conference paper submission. We love you too, reviewer #1.


Criterion    | #1   | #2   | #3   | #4   | #5   | #6   | #7   | #8   | #9   | #10  | Av.
Relevance    | 2    | 3    | 2    | 2    | 4    | 4    | 4    | 4    | 4    | 4    | 3.3
Theory       | 1    | 3    | 2    | 4    | 4    | 3    | 3    | 3    | 4    | 2    | 2.9
Methodology  | 1    | 2    | 3    | 2    | 4    | 2    | 4    | 3    | 4    | 4    | 2.9
Presentation | 2    | 4    | 3    | 2    | 3    | 3    | 4    | 4    | 4    | 3    | 3.2
Validity     | 2    | 2    | 3    | 2    | 3    | 3    | 4    | 3    | 4    | 3    | 2.9
References   | 2    | 4    | 4    | 4    | 4    | 3    | 3    | 4    | 4    | 4    | 3.6
Contribution | 1    | 2    | 3    | 3    | 3    | 3    | 4    | 4    | 4    | 4    | 3.1
Originality  | 2    | 4    | 3    | 3    | 3    | 3    | 4    | 4    | 4    | 5    | 3.5
Interest     | 2    | 4    | 4    | 4    | 4    | 4    | 4    | 4    | 4    | 4    | 3.8
Average      | 1.67 | 3.11 | 3.00 | 2.89 | 3.56 | 3.11 | 3.78 | 3.67 | 4.00 | 3.67 | 3.24

5-pt. scale.

Tuesday, August 24, 2010

Evolved Systems—Academic Peer Review

Too often I see calls for reforming the academic peer review process or, another favorite, for the eradication of academic tenure, and you read the article and realize the person has no idea what they are talking about.


There's another such article over at the New York Times, by Patricia Cohen, Wikipedia Age Challenges Scholars’ Sacred Peer Review. There are so many things wrong with the article, it's almost a challenge to know where to start. Let's start at the beginning, then, since I already gave you the summary.

Peer review is not "sacred," as the title has it; it's not even clear what that would mean. Peer review is a process that evolved over time to serve a need scholars have: figuring out which articles are any good, in a way that works.

The first sentence continues the misguided commentary. "For professors, publishing in elite journals is an unavoidable part of university life." As a writer, you like your first sentence to be accurate, among other things. Publishing in elite journals is not at all "unavoidable". It is quite avoidable: don't submit papers to elite journals. Trying to get published in elite journals is usually required, but that is very different from publishing in them. In fact, if the journals are elite, they won't publish papers by most people. And if you're at a teaching college, publications are not as important as they are at research universities.

The second sentence continues the comedy of errors: "The grueling process of subjecting work to the up-or-down judgment..." No. Subjecting the work to review is easy; you send it in. The grueling part is doing the research and then writing it up perfectly. This means you have to have a great research question, great data, great methods, and interesting findings. And it's not an up-or-down judgment at all. For all of the journals I have reviewed for, it has never been up or down. I've done reviews for five different journals and two conferences (and I have reviewed every year for the conferences, and more than once for some of the five journals). It's always a range of options for journals, and you give comments. Usually, the options are like the following:
  1. Accept as is.
  2. Accept with minor revisions.
  3. Resubmit with minor revisions.
  4. Resubmit with major revisions.
  5. Reject.
As you can see, that's not "up-or-down." Resubmits can be rejected, or sent back a second time as another resubmission, where they can be accepted or rejected (usually at that point they don't let you resubmit it again).

Cohen refers to peer review as a "monopoly". It's a process. She continues,
Instead of relying on a few experts selected by leading publications, they advocate using the Internet to expose scholarly thinking to the swift collective judgment of a much broader interested audience.
Relying on experts for their opinion on the article is the entire point! These experts are not "selected by leading publications"; reviewers are selected by the editors at the journal where the paper has been submitted. Use of the word "swift" implies that peer review is slow. Well, it's not as if we get paid for it. And it's not as if it counts toward tenure. We do it to help make journals better, which makes the field better as a whole. Sometimes you might get 10% off books from the publisher which publishes the journal. If you do a good job, the editor might remember it, but that also means more reviewing for you in the future. And relying on "a much broader interested audience" is a terrible idea; papers should not be judged by an interested audience, they should be judged by an expert audience.

As a writer, you don't want too many reviewers, because at some point they will disagree with each other and then there is no way to change the paper to make them all happy (which is the only way to get the paper published). Note that this is not pointless reviewer-pleasing: if they approve of the paper, that means the paper makes a positive contribution to the field. I co-authored a paper for a conference recently and there were ten reviewers, which made our job as authors very difficult because many of the reviewers disagreed with each other (though they have no idea that they are disagreeing with each other; this is not publishing by committee).

Regarding one of the crowd sourced peer review projects, Cohen writes, "In the end 41 people made more than 350 comments." As an author I don't want 350 comments. I want maybe five sets of comments at most, from qualified people.

There are a lot of other errors in the article, too many to detail at length. Here's one:
The traditional method, in which independent experts evaluate a submission, often under a veil of anonymity, can take months, even years.
This is not true. For the two most recent reviews I have done, I had a one-month deadline. It is true that for one of my co-authored papers it took over a year to finally get it published, but that's because the journal changed editors and they lost the paper in the switchover, and because all three of the authors moved. The editors were also slow, but if you're going to be slow about "traditional" reviews (which are all done by email these days), you're going to be slow about web-sourced reviews. All of the things by which you are judged and upon which your job depends (teaching, research, conferences, publishing, academic service) don't just vanish because you switched to web-based reviews.

And the "veil of anonymity" that Cohen claims "often" exists is not real. Reviewers are anonymous to the authors, but the editor knows who you are, and the editor or editors will read your comments. If you're rude or out-of-line, they may send your comments back, and you will have tarnished your reputation. But the reviewers doesn't know who wrote the paper (although at times you may have suspicions, or if you think you know you tell the editor and recuse yourself), and that's important. The reviewer judges the paper on its own merits. The reviewer and the author are anonymous to each other, but the editor knows who both are, and as such fills an important gatekeeping function. By lacking anonymity with the editor, both reviewers and authors want to do a good and accurate job (which may mean rejecting the paper as a reviewer, as I have done several times). But by simultaneously having anonymity with each other, the authors and reviewers can focus on the words of the paper and the reviews. As a reviewer, I can crush some poor sod for writing a terrible paper and know they won't hate me, as a writer I know the comments are solely about the paper and not about me (and I won't hold a grudge against the reviewer for not seeing the genius in the paper). This is a very important part of the process.

Clubby exclusiveness, sloppy editing and fraud have all marred peer review on occasion. Anonymity can help prevent personal bias, but it can also make reviewers less accountable; exclusiveness can help ensure quality control but can also narrow the range of feedback and participants. Open review more closely resembles Wikipedia behind the scenes, where anyone with an interest can post a comment. This open-door policy has made Wikipedia, on balance, a crucial reference resource.
What a horribly wrong paragraph. "Clubby"? Reviewers don't edit. Fraud? Perhaps Ms. Cohen has never heard of the Internet, speaking of fraud? As I just mentioned, reviewers are not anonymous to the editors, and are fully accountable to them. Exclusiveness does indeed help comment quality, which is the point. Limiting the participants is the whole idea; otherwise, I'll just go get all my friends to say how awesome my paper is and I'll do the same for them. It is not clear what she means when she says Wikipedia is a "crucial reference resource." Wikipedia is, generally speaking, a giant mess of articles that seem correct enough to those who speak the loudest. Academic databases are crucial reference resources, and Google Scholar isn't bad, but Wikipedia is a disaster.

The beginning of the second page is a little better, and is actually accurate, which is a surprise given the first page; however, it degenerates into an error-filled comedy once again. For example,
Advocates of more open reviewing like Mr. Cohen at George Mason, argue that other important scholarly values besides quality control — for example, generating discussion, improving works in progress and sharing information rapidly — are given short shrift under the current system.
Well, no. Discussion is what happens with your colleagues, on mailing lists, discussion boards, blogs, in the hallways, at the coffeeshops, and at conferences. Improving and sharing work is what conferences, email, and posting items online do. They have nothing to do with peer review.

Oh, and we see the New York Times following its annoying policy of not calling people with PhDs "Dr.," as we deserve to be called. I could write "Mr. [sic] Cohen..." but I think I've pointed out more than enough errors.

Saturday, August 14, 2010

Play and Buttons and Fingers

I was reading a New Yorker article, "Painkiller Deathstreak: Adventures in Video Games," by Nicholson Baker in the August 9th, 2010, issue. Concerning the "seventeen possible points of contact" for his fingers and the Xbox 360 controller, which he may need to consider to play a game and do one of the many actions he lists (like run, crouch, aim, fire, pause, leap, speak, stab, grab, kick -- actually I think he lists 17), he writes...

It's a little like playing 'Blue Rondo a la Turk' on the clarinet, then switching to the tenor sax, then the oboe, then back to the clarinet.
So, yes, crazy mad finger positioning that you had better know ahead of time, like I was talking about previously.

Monday, August 9, 2010

Ukulele: Play, Create, Share

(To be clear, "Play, Create, Share" is the tagline from Sony and Media Molecule's game, LittleBigPlanet.)


I had finished most of the book -- all the research was done, and all the body chapters were roughly at first-draft stage. It was off at some publishers, so I decided... needed... to take a break from it before the final push to make the intro better, tie together the loose sections at the end, and write up the conclusion. So, I decided to take up the ukulele and learn how to play it. (Ukes are cool; check out the vids in this post.)

The ukulele is a decidedly non-electronic, non-Internet beast. It is real, and tangible, in your hands, as are the calluses you might get. It has no built-in spellchecker, but you can hear when you hit a wrong note, which is curious and encouraging.

My cousin, who plays in a Uke band, made me a song book with chording tabs--these show you where your fingers go on the strings in order to play a certain chord (a combination of notes). The tabs are from the Internet. I email her, a friend, and my uncle, who was in a real band for many years, about playing stringed instruments. (I also have a few other relatives I can email about these things, but focus on the Internettedness of it all.)

There are tons of YouTube videos with people playing ukulele. They just do it. (My point is that we are driven, psychologically, to play with things like musical instruments, to create things like videos, and to share them -- all of the activities create and reinforce community, because we are driven to connect.)

Note this one of a kid, which has almost 24 million views in under a year. (1:18 ftw!)

There are also thousands of tablatures online, for a variety of instruments (although I mostly pay attention to uke and guitar tabs). People have made these, put them together, and put them up to share with others so that others can play too. (One frequent note on them is a rather weak write up defending the tab in terms of US fair use, which could be written a lot better.) Here's one with all of the songs by The Smiths (and it uses the same layout as this blog). (Hmm I had one for the Beatles which was kinda cheesy but did the job, but I don't see it now. Oh here it is, click through to a song hosted on the site and you'll see what I mean.) You can tune your uke online.

There are also groups, of course, a.k.a. online communities, like the Ukulele Underground, who host discussion boards and have instructional (and awesome) videos: ukulele lessons, ukulele minutes, and member videos. Yes, member videos, made by people and posted to the site.

Play.
Create.
Share.

It doesn't matter if it's video games and mods, it doesn't matter if it's a more physical and just as visceral object like a ukulele, it's what we do, and the Internet allows us to express this playfulness, this creativity, and allows us to share these things we love to do, since everyone loves to do them. (I'll point you to Stuart Brown's Play if you don't believe me, and you can check out the NPR/SOF show where he was interviewed.)

And, as I said before, and as Brown points out (in the book at least), these are all community creating and reinforcing behaviors. I could also talk a bit about the visceral, long-time importance of dance and how our mental structures which relate to dance are connected to the ones that relate to music, but it's been a while since I wrote that part of the book so it's a bit rusty. Perhaps later.

The thing is, we do this (connect) with possibly everything.

Saturday, August 7, 2010

"They'll read everything."

So says Bruce Schneier, an author and chief security technology officer at British telecommunications operator BT, in an article about the ongoing BlackBerry negotiations (I copied his job description from the article, to be clear, but if I put it in quotes it makes it look like the description is misleading). Specifically, he's referring to the Saudi government and the recent BlackBerry data dust-up. "They'll read everything."

And, as I noted, they probably already do read everything else. This still implies that the countries that aren't making a fuss are already reading all the RIM/BlackBerry data they want (and everything else), especially Western nations. The nations mentioned in the NYT article are:

  • Saudi Arabia
  • India
  • The UAE
  • Indonesia
So, one must assume that the US...
  1. Gets all the data they want from RIM's BlackBerry service.
  2. Doesn't share it with any of those above countries in a way they like.
Apparently there are already local servers, or deals for them, in Russia and China. The article says, "Schneier said the Saudi arrangement is similar to deals RIM has struck in Russia and China," which doesn't make it exactly clear whether those deals have happened or will happen.

However, RIM "issued a statement last week denying it has given some governments access to BlackBerry data." So, it's not really clear. And one can safely assume that some governments, like the US, don't actually ask in a way that RIM would have to refer to as "giving"; perhaps it's more like "taking."

One also assumes that the US government would share anything important it discovers with the Saudi government if it were relevant (and vice-versa), so I don't see that terrorism is really an issue, and maybe various agencies aren't actually getting along as well as they should. Or, as mentioned in the article:
Critics maintain that Saudi Arabia and other countries are motivated at least partly by a desire to curb freedom of expression and strengthen already tight controls over the media.
Sadly the article does not actually name or interview any of these critics.

Addition: A better article, finally, from the NYT.

Tuesday, August 3, 2010

More Crowdsourcing Confusion

There's an article today at the NYT about the Stardust@Home project which shows some of the clouded thinking and definitions around "the crowd."


So far, they've found three of the ones they are looking for in their aerogel, and each is apparently about 1/25,000 of an inch big. Wow. Go, science!

But, there's either a lot of empty aerogel or a lot of space dust of types they aren't looking for in their aerogel (the article isn't quite clear), so they needed help finding what they were looking for. The scientists threw it out to people on the Internet, which is where the article goes a bit off the rails.

So, the scientists have...
Help from an army of amateur researchers.
Are these people the crowdsourced people, or interns? It's not clear. I think it's the crowd, but the crowd isn't "amateur researchers," although one hopes to find such people in the crowd.

The scientists turned to non-experts around the world to sift through thousands of images.
That's fine, but if all the "non-experts" are doing is looking at pictures, well, pretty much every sighted human being is an expert at that. It's one of the things we are built to do; our survival depends on it (although these days, historically speaking, blindness is not as big a drawback as it was in pre-history). We are, in fact, experts at it. They may not have PhDs in astronomy, but to look at a picture and determine some factors about it (the angle of a dust particle, I believe), you don't need a PhD in astronomy.

But,
Interspersed test images allow the researchers to check how well the dusters [the people on the Internet who are looking for space dust] are doing.
So we have them, but we can't quite trust them, although to be clear this is probably more of a visual quality check than a moral one, and it is one that scientists often perform anyway with data (speaking as a scientist: we like data checks, they make our data and results better).
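This seeding trick is easy to sketch, by the way (a toy version of my own; the image names and labels are made up): score each duster only on the images whose answers you already know.

    # Toy sketch of the seeded-image check: interleave images with known
    # answers into the stream, then score each duster on just those.
    def accuracy_on_seeds(responses: dict, answer_key: dict):
        """responses and answer_key map image_id -> label; the key holds the seeds."""
        seeded = [img for img in responses if img in answer_key]
        if not seeded:
            return None  # no seeds seen yet; can't score this duster
        correct = sum(responses[img] == answer_key[img] for img in seeded)
        return correct / len(seeded)

    key = {"seed1": "track", "seed2": "empty", "seed3": "track"}
    duster = {"seed1": "track", "seed2": "empty", "seed3": "empty", "img77": "track"}
    print(accuracy_on_seeds(duster, key))  # 2 of 3 seeds right, about 0.67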

The first presumed interstellar particle — actually two distinct pieces — was found by a Canadian duster, Bruce Hudson, who retired as a carpenter and groundskeeper after a stroke. Mr. Hudson said he had looked through 25,000 images, spending as much as 5 to 10 hours a day at it.
That isn't just a guy from the crowd; that's a pretty dedicated person! Most of the crowd isn't like this. But that's the often-overlooked point about the crowd: you don't want the crowd, you want the people in the crowd who might want to help, and distinguishing them ahead of time is difficult. It's easier to let them self-identify by giving them the opportunity to do so.

The person who found the second particle said,
“Although I spend my working days in front of a computer solving problems and verifying designs, I found it was quite relaxing to look through the photos and concentrate on the visual images.”
They're looking at images. If you have a working visual system, you're an expert.

Sunday, August 1, 2010

BlackBerry Blocking

In today's NYTimes, via the AP, Saudi Arabia to Block BlackBerry Messaging. The UAE, however, decided first, and Saudi Arabia followed, which makes for a rather odd headline. The UAE isn't instituting the ban until October 11, and the article doesn't address why they are taking so long. (Saudi Arabia will, maybe, block messages "later this month.") I assume they could do it tomorrow if they wanted. Is it more a case of posturing, to get some local control over BlackBerry messaging?


From the article:
Regulators say the devices operate outside of laws put in place after their introduction in the country, and that the lack of compliance with local laws raises ''judicial, social and national security concerns for the UAE.''
However,
Regulators said they have sought compromises with BlackBerry maker Research in Motion on their concerns, but failed to reach an agreement on the issue.
Sounds like strong-arming. Why is this needed, and why only with BlackBerry?
Unlike many other smart phones, BlackBerry devices use a system that updates a user's inbox by sending encrypted messages through company servers abroad, including RIM's home nation of Canada.

Users like the system because it is seen as more secure, but it also makes BlackBerry messages far harder to monitor than ones sent through domestic servers that authorities could tap into, analysts say.
Ah, the surveillance society! I would assume the US already intercepts and decrypts all the BlackBerry info. Apparently we aren't sharing enough with the UAE or Saudi Arabia.
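To make the routing point concrete, here's a toy sketch of why end-to-end encrypted traffic relayed through a foreign server is so much harder to tap than plaintext through a domestic one. This is not RIM's actual protocol (I don't know its details); the shared symmetric key and everything else here are illustrative assumptions, using Python's third-party cryptography package.

```python
# Toy illustration only; not RIM's actual protocol. The point is just
# that a carrier relaying ciphertext it holds no key for sees nothing
# useful, while a domestic server relaying plaintext can be tapped.
from cryptography.fernet import Fernet

# In this toy model, only the handset and the far-away server share the key.
key = Fernet.generate_key()
handset = Fernet(key)

message = b"meet at the usual place"
ciphertext = handset.encrypt(message)

# What the local carrier (or anyone tapping the line) sees in transit:
print(ciphertext)  # opaque bytes, useless without the key

# What the key-holding server at the other end recovers:
server = Fernet(key)
print(server.decrypt(ciphertext))  # b'meet at the usual place'
```

In this sketch, the local telecom is just a dumb pipe: unless it gets the key (which is exactly what these governments appear to be demanding, in one form or another), all it can log is ciphertext.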

Addition: What this strongly implies, if not outright makes clear (OK, it's clear; they just don't say it), is that the UAE and Saudi Arabia are already scanning/reading/monitoring/intercepting all the traffic they want on other phone devices.

The NYT decided their own writeup would be better than the thin AP wire feed, so they have an article up.

Thursday, July 29, 2010

What Bothers Me About "Inception" Buzz

Spoilers: This post has them.


Inception is a cool movie, visually speaking. But beyond that, I don't think it has much. Leonardo DiCaprio's character is the only one who is developed, and all we get is that he loves/loved his wife and loves his children. The character of the mark (the son of the dying businessman) is developed a little; he seeks his father's acceptance (this is pretty thin, though).

A lot of the material I've seen online says how amazing this film is. I agree it is visually amazing. But that's not what people are talking about; they're talking about what was real, what was a dream, and whose dream it was.

And all of that is irrelevant.

Does it matter if the ending scene is a dream or not? No. It doesn't. Not one bit. Because these aren't real people, and we don't learn anything about anyone from it being a dream or from wondering whose dream it may be. Does it matter which scene belonged to which dreamer, and were there clues that we had been misled? Irrelevant. That may be fun, but it doesn't tell us anything. Is it revealing about the characters? No, because it would need to be revealed more clearly, and even if it were, so what? They're characters in a movie; they're not real. There is no insight here except into what the writer, director, and producers wanted to do with the film.

Does it matter if Leonardo DiCaprio's character has been in a dream the whole time? (I think we see the spinning top work at some point, but I don't exactly recall.) No, it doesn't. It's a movie; it's not real; it is itself like a dream. There is no reality in it. If LDC's character is in a dream at the end, then he's in his dreamworld fiction inside a movie fiction, so it's fiction. If he's not in a dream at the end, then he's in the movie's real world inside a movie fiction, so it's fiction. It does not matter. There is no weighty intellectual discussion here, unless you're in college and you think you're much smarter than you actually are (and then it is neither weighty nor intellectual, but sadly it is a discussion).

Dream and unreality sequences aren't even new to science fiction movies: we saw them in The Matrix, and long before that in stories such as The Wizard of Oz and Alice in Wonderland. The Wizard of Oz is old; it was written in 1900. That doesn't mean it isn't an enthralling idea 110 years later, but we've seen it before.

Visually, Inception is a great, fun, inventive, beautiful movie. Beyond that, it's pretty standard. There's nothing wrong with that, but all of these people and critics think there's something weighty about the dreams. There isn't.

Wednesday, July 28, 2010

Community

"It's this right here. Hanging out with your friends and fellow artists."

http://www.phdcomics.com/comics.php?n=1348


(Reminds me of ICA, and my "once a year" friends, as it should, since it's the same fundamental human need, the need to connect and form communities.)

Tuesday, July 27, 2010

Guitar vs. Guitar Hero

It was pointed out to me by my brother that playing the guitar is a lot like playing a video game: there are certain things you need to do with your fingers at certain times, and you need to memorize the moves (either specifically or generally) before you try them so it works out better.


He's right.

Having played electronic games for over two and a half decades, and now finally poking at the ukulele, I can say it's entirely true. It isn't true for all types of computer games, but with games on the Xbox 360 or the PS3, for example, you need to know which buttons and triggers to mash when. Some computer games are finger-twiddlers (mash buttons!), but usually not just any buttons: the right ones at the right times. Which means your fingers have to be in the right place at the right time, just like fingering chords on a guitar or other stringed instrument (bass, banjo, ukulele, upright, violin, etc.).

I can see a future where Guitar Hero and Rock Band guitars don't just have five buttons on the neck and a few other controls on the body (see Rock Band guitar info); they have some larger number of buttons depending on the complexity of the instrument. A ukulele has four strings, so four across by however many you want down the neck. Basses usually have four strings (although I have a bass with five). Guitars typically have six, so six buttons across by the number you have coming down the neck. Many more buttons, but the idea and the skills are exactly the same: where are my fingers positioned, and when? In the TV show Californication, the young daughter plays guitar and also Guitar Hero. Can Guitar Hero be a lead-in to guitar playing? I'm not sure if it is, but it could be.
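Just to show how simple the "right fingers, right place, right time" idea is underneath, here's a tiny sketch of how such a hypothetical full-neck guitar game might store and check its chart. Every name, number, and the hit-window size is invented for illustration; real Guitar Hero and Rock Band charts are certainly organized differently.

```python
# Hypothetical full-neck guitar game: each charted note is a
# (string, fret, beat) triple. All values here are invented.

# A C major chord on a ukulele: fret 3 on string 1, other strings open.
chart = [
    (1, 3, 4.0),
    (2, 0, 4.0),
    (3, 0, 4.0),
    (4, 0, 4.0),
]

HIT_WINDOW = 0.25  # beats of slack allowed on either side of the note

def is_hit(pressed, strummed_at, chart):
    """True if the player held exactly the charted frets close enough to the beat."""
    due = {(s, f) for s, f, beat in chart if abs(beat - strummed_at) <= HIT_WINDOW}
    return due == pressed and bool(due)

print(is_hit({(1, 3), (2, 0), (3, 0), (4, 0)}, 4.1, chart))  # True: right chord, in time
print(is_hit({(1, 2), (2, 0), (3, 0), (4, 0)}, 4.1, chart))  # False: wrong fret on string 1
```

Swap the chart for six strings and more frets and you have the guitar version; the check itself doesn't change, which is exactly the point about the skills being the same.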

[Addition]
Here's a good point from my friend Dr. Matthew Bietz. I totally forgot to think about the next step: production versus consumption. In guitar games, we are consumers and copy (to some extent) the music. With a real guitar, you make your own music, or riff on things, or play things slightly differently, and that can be a good thing. Production is much more powerful, on an important, fundamental human level.

The tagline for LittleBigPlanet by Sony and Media Molecule is "Play. Create. Share." and when we play music together, that is what we are doing. Playing, creating, and sharing are all community-building and community-reinforcing activities, and since we are driven to connect, they are all important. Playing a guitar game, there is play, but there is no create and share.
[End Addition]

Ukulele... You may not have much love for the ukulele, but here are some things that may help you down the path.

Walk Don't Run by amazing little Japanese crocheted guys.



While My Guitar Gently Weeps covered by Jake Shimabukuro (who has his own YouTube channel). Over 5.5 million views. He goes nuts at 2:40, although there are hints of what is to come earlier, and you should watch the entire video anyway. Amazing.



Here he is; at 1:10 he demonstrates the range of the ukulele. And he's at TED. Most of you have not been invited to TED. TED? Very cool. TED = ukulele.

Sunday, July 25, 2010

The Madness of Crowds

"The wisdom of crowds" was never a wise saying, really it referred to how if you have a big enough group of people, a few of them will know something about the problem at hand. It's just statistics.


Sadly we have reminders about the madness, and thus lack of wisdom, of crowds all the time. There was just one recently at a music festival in Germany, where 19 people died and over 340 were injured.

I just came across this new Cocktail Party Physics post about the inanity of Amazon user reviews (that's the crowd at work, remember). I think part of the problem is explained, of course, by the Penny Arcade Internet F***-wad theory. I know that's not polite terminology, but they did name it first and deserve credit for proposing it. Not all Amazon reviews are anonymous, but many are tied only to an account name. The "audience" in the theory is also called a "crowd," just to be clear. Crowds are neither all wisdom nor all madness; we know people behave differently in crowds (or, as a crowd), and you will find some diversity of people in the crowd, though that depends on the crowd's makeup, of course. Nothing new to see here; the waste of ink and bandwidth is unfortunate.

Saturday, July 24, 2010

"So Much For The Dell Model."

I recall the hype about "the Dell process" or "the Dell way" or whatever it was called (their manufacturing model), and the jealousy Michael Dell had over Apple's success. We Apple fans (and maybe some others) liked to refer to Dell as "Dull," since they made dull beige boxes.


Turns out their great production process wasn't. It was all accounting tricks. Reminds me of Enron, really. You would think people would wise up, although I guess the people dumb enough to think they are smart enough to break the rules are also the ones not smart enough to make it work or get away with it. You could blame some of the losses on the economy, but Apple is doing fine.

So, the SEC has released some internal Dell emails where the Dell people discuss how they are only making their quarterly targets with a little side money from Intel. Or, a lot. Dell's quarterly target should have been "make great computers."

My headline is from the writeup by Ashlee Vance.

Friday, July 23, 2010

Flawed Comparisons

Some comparisons really don't work, but are presented as if they are spot-on. Here's one from CNN's "Smartest People in Tech" that I found amusing, from #2 CEO Jeff Bezos' writeup (which comes after #1 CEO, Steve Jobs).

...virtually every iPad review compares Steve Jobs' tablet to Bezos' device.

Well yes, of course, since readers aren't going to know of any of the few other touchpad devices out there (not touch phones, but touch pads). But why would you compare the two? The Kindle does only one thing (and will probably stay that way), while the iPad... well, as the advertisement goes, there's an app for that. It's not a great comparison beyond the form factor and the fact that the two devices are contemporaries.

Thursday, July 22, 2010

Feep and The Purple Pants

This is the current opening to my book proposal, trying to ground the higher-level ideas in an understandable and positive story. Granted, it's a story about pants, but a little bit of humor is good too.

One evening, when I was playing Sony’s massively multiplayer online game EverQuest II, one of my guildmates, whose character’s name is Feep, dropped a link for a pair of pants into the guild chat channel. He had just killed some evil creature, and the pants were part of the treasure he had received. Specifically, these pants were torn purple pantaloons, which I had never heard of before. The name was unusual. I clicked on the link in the chat window to learn more about these purple pants. Up came the item description and statistics for the pants. They had a spell on them, called rage, that gave the wearer increased strength and stamina but took away intelligence and wisdom. These were not your typical pants or armor from the mythical world of EverQuest II; these were something quite different that really had no place in the game. They didn’t belong to any creature in the game, and they were from another company’s intellectual property altogether. These purple pants belonged to The Hulk from Marvel Comics.

Feep’s purple pants are just one example of the many ways connection and play are experienced on the Internet. People are playful, so Feep and I, and many others, were playing EverQuest II (often called EQII) at that moment. People like to connect, so the creators of EQII at Sony designed the game so that players would do better if they joined forces. People can form long-term guilds, and Feep and I were members of the same guild (although we had never met in real life). The programmers at Sony also made a guild chat channel, so all guild members could text chat with one another, because we like to connect, and communication builds and strengthens a community like a guild. Sharing can strengthen communities as well, and Feep was sharing information.

The homage to The Hulk was playful. The pants had to be made and placed in the game. People can make in-game items—from the mundane, like arrows, to the more spectacular, like fish tanks, and to the completely unnecessary, like toilets. People are driven to create things and are not just passive consumers. Players also make a lot of things about the game that are not in the game, such as guild websites and wikis.

But the pants were not created by players. EQII does not allow players that level of creativity. The Hulk’s purple pants were instead playfully made by the programmers at Sony. All people are driven to play, and play itself is a behavior that builds community.

To play EverQuest II, you need a computer that is connected to the Internet. Although EQII is in some ways tightly controlled (such as with what players can make), in other ways it is not (such as with the text and voice chat channels). The Internet is controlled much less than EQII, which was why Sony could go ahead and make the game run over the Internet without asking anyone’s permission to do so. EQII works, in part, because it runs over the open Internet, and players can make websites, wikis, and have real-life meet-ups. EQII as a whole takes place in many more places than just the EQII game world. EQII works because the designers knew that people like to play and, more importantly, like to connect.

The story of Feep and the purple pants highlights two fundamental human drives: the drive to connect and the drive to play.

Tuesday, July 20, 2010

Connect, in Serial Format

Since enough of my book is written and off at publishers, where it should get picked up, I thought I'd present some of the writing and ideas here in condensed, serialized form. The working title is Connect: Why the Internet Works, or perhaps Connect and Play: Why the Internet Works, but I am partial to the shorter title. (Note that it is not How the Internet Works; one of my friends objected that Why might sound like it's about TCP/IP, which it is not. This is why, not how.)


Why does it work? Because it allows people to connect.

The major sections are the introduction, the Internet, CompuServe, videotex, and the conclusion. The Internet is the majority of the work since it is the system that is still with us, the one that succeeded where the others failed over time (although CompuServe was with us for a long time). All three systems were started or conceived of in the late 1960s. (Videotex is a bit different since it is not a single system like the other two but a type of system; still, the comparison works and is narratively compelling.)

I look at what we do with the Internet, and discuss two fundamental human drives that are important for what we do online: the drive to connect (with others) and the drive to play. These are what make the Internet work.

CompuServe and videotex didn't allow people to connect enough, or to be playful enough with content. CompuServe adjusted over time but couldn't compete. Videotex was designed with control in mind and failed miserably. (No, no, Minitel was different; that's a long story, but it wasn't at all like any of the US videotex efforts, although no one over here seemed to realize that.)

People are extremely social (the drive to connect), and many mammals (and even some birds and octopi) are playful creatures. When a system allows us to be who we are, we use it and it succeeds. This is true not just in life but also as a key to success in business (for example, see Drive by Daniel Pink and the IDEO book The Art of Innovation by Tom Kelley).

Most of the book looks at what we do online in a social way, so most of my examples and ideas will be based on Internet activities.

Look for posts with the Connect label. This is the first post written intentionally under it, but I have some previous posts that stem from the work, so I'll go back and put the Connect label on them as well.