Showing posts with label History. Show all posts

Monday, November 16, 2020

What is the 20th‐century French Revolution?

Apparently, it is the food processor! A friend of mine pointed me towards this and it's too good not to share. A lot of technologies get hyped pretty heavily -- I think the Segway electric scooter thing was going to revolutionize everything (I forgot how to spell that; it isn't "segue") -- but I didn't expect to see this level of excitement for a food processor, possibly because I grew up in an era with food processors.

It has been labeled, and not without justice, the 20th‐century French Revolution. It is the equivalent of an electric blender, electric mixer, meat grinder, food sieve, potato ricer and chef's knife rolled into one. Its invention, in the minds of serious cooks, ranks with that of the printing press, cotton gin, steamboat, paper clips, Kleenex, wastebaskets, contour sheets and disposable diapers. It has, in many minds, rendered the electric blender a hopeless antique, and we are willing at any time to relegate our old faithful to the Smithsonian.

The new mechanical Merlin is a French import known in this country as the Cuisinart Food Processor. It is a multifaceted marvel that has, as the saying goes, more uses than money. It blends, slices, grinds, grates and purées, or, as one kitchen enthusiast put it, “does everything except sweep floors, wash dishes and talk back.” 

(From the New York Times Archive, article titled "Kitchen Help", by Craig Claiborne with Pierre Franey, from March 16, 1975.)

As a communication scholar, I find it interesting to see the food processor compared to the printing press. I'd say it is a rather hilarious comparison, except that the article goes on to say, quoting, I think, a French food writer...

“The French are just getting to the point,” one of them said, “where they are willing to invest in something more mechanical in the kitchen than an electric hand beater. I have lots of friends with servants who refuse to install an electric dishwasher. As everywhere else, the age of the servant is on its way out in France. Five years ago almost anybody with slightly above average means could afford the cheap kitchen labor offered them in the form of Spanish and Portuguese cooks, mostly female. But that's coming to an end. Until now the French have simply not thought in terms of kitchen aids but in terms of kitchen help.”

Servants! It looks like there is a really interesting issue there with gender, class, and nationality, as well as technology.

Edit: Could it have been the 20th century French Revolution because, like a guillotine of the previous French Revolution, it chops things with a blade? (I don't think that's what the writer was going for, though.)

Tuesday, July 10, 2018

Who is Aunt Lawrence?

This is (a photo of) a black and white photo, circa 1880-1890, of what is probably a color oil painting dating from that era or earlier--I'd guess earlier due to the clothing--and on the back is written "Aunt Lawrence". I post it online in the hope that it will get indexed and recognized, because maybe someone still has the painting. (I've tried Google's reverse image search.)


The photo is in a collection from George L. Underwood (1831-1920), my 4xG uncle, who was a photographer, and I'm certain it's from his wife's family. His wife was Katherine (Kate, Catherine) Luyster Underwood, 1835-1911?, born Kate Lawrence Luyster. Her paternal grandmother was Catherine W. Lawrence, 1763-1855. CWL's father was William Lawrence, 1729-1794, who was married to Anna Brinckerhoff, 1733-1770, and one of GLU's children, Ethel, gets her middle name from Anna (her 2xG grandmother).

Even with the help of Ancestry.com, I cannot currently determine who "Aunt Lawrence" is exactly.

GLU's line died out in 1977, when the second of GLU's two grandchildren died, although the photos had been in a closer branch of my family due to what appears to be the implosion of GLU's family around 1888-9.

Update: An antique store owner was really kind and estimated that the painting is from between 1840 and 1870, based on the woman's clothes. One guess is then Jane Lawrence, 1783-1838, who was Kate Luyster's great-aunt (so her father would have referred to Jane as an aunt), because Jane is the only woman in that generation who lived to adulthood (according to Ancestry.com records) who wasn't Kate's grandmother (Catherine W. Lawrence, married to become Catherine Luyster, 1763-1855). Jane appears to have married a Hendrick Suydam, though, so she wouldn't have been a Lawrence after that -- but the records aren't always great from back then. Or perhaps family members referred to Kate's grandmother as "Aunt Lawrence", and maybe that's why Kate had a photo of the painting; the date estimate actually fits her a bit better as it is.

Sunday, November 5, 2017

Blogging, Ephemeral

So, I've been blogging for well over ten years now (though mostly the "readers" are myself and, I would guess given the consistent 30-40 views per post, webcrawlers). I was also blogging at UM before the earliest posts listed here (2007), which IIRC are re-posted UM posts.

Which made me think about how ephemeral this all is, although I might have a folder with the UMich blog material in it.

Relatedly, I've been thinking about digital photos, and about what Facebook, Instagram, and Tumblr really are: it's all ephemeral. Someone pointed out a while ago that, in terms of records and photographs, we are in a terrible period, as none of it will be saved for posterity. I'm debating taking an afternoon and ordering up some print books of my photos, but I'm not even sure what I have anymore. I think I've lost my Belgrade trip photos from the days of my Canon ELPH (~2003?), although I might have a CD with the photos from a friend who was on the trip as well (but neither my desktop nor my laptop has a CD drive; I do have an old MacBook with a CD drive that I've saved just in case).

So, no more finding ancestors' diaries and photos.

My great-grandfather and great-grandmother made a scrapbook of their trip to England in the summer of 1914, just before WWI started. My great-grandmother's parents were German and had emigrated to the US. Oops -- being German-ish in England during WWI was not a great idea. They managed to get some money somehow and got on a boat home, but it wasn't easy. Sure, had this been 100 years later there might have been a blog, but would it even exist for a great-grandchild to read 100 years after the fact? Our digital world hasn't been around long enough to tell, but, given the rate of digital decay we've seen, I suspect the answer is no.

Tuesday, April 11, 2017

Why Isn't Underlining Dead?

I was editing a shared document, and we had a bunch of different header types, and some were in underline style.

I changed them to italics.

Now, the history lesson.

Back in the day of typewriters (IBM Selectric II with correctable ribbon ftw!), you had the keys (pretty much the same as computer keyboards today) and each key related to a ... thing. Arm? The typing part that thwacked away onto the ribbon, leaving an imprint on the paper (and if you hit too many keys at the same time, the arms would try to occupy the same space at the same time and they'd get stuck). Oh, "type hammer" or "typebar" -- well, those are ungainly. (The Selectric itself, to be fair, used a spinning "golf ball" element instead of arms, but most typewriters had typebars.)

Each bar / arm had two possible items it could print -- usually, a lower case and an upper case of the same letter. Hitting the SHIFT key would shift the mechanism (on typebar machines it raised the type basket or the carriage, hence the name) so that instead of the lower-case part of the bar hitting, the upper-case part would.

So you see what is missing: not just fonts, but italics or even bold.

Underlining was easy, though: just back up and hit the underline key over the text. _____ Like that. On the computer, however, _ is its own character, and it can't be overstruck onto other characters to underline them.

Underlining was the way you indicated to the typesetter (for printing for real, in a book or journal or magazine or newspaper, that is, on paper) that you wanted italics. Even early on with computers when you could have the computer do italics, often we were still told to use underline and then the typesetter would know what to do, since it was the same old thing. Computers were just fancy typewriters in this way -- but of course, they weren't.

Today we are stuck with so many web pages and word processor programs that insist on using black text and a white background to look like black ink on white paper. It's familiar; we know what it is. But that paradigm should be dead -- not entirely dead, but dead in its solitary focus. We can do black ink on white paper, but we don't have to do black on white in the digital space. The paradigm should be "do what is nice and readable." The Financial Times, which prints on slightly pink paper, brands its website in this way (pinkish background), which is sensible. (Actually it's looking yellow now; I thought it looked pink a few months ago.) Microsoft Word for Mac used to have a "white text, blue background" option, which I loved across many versions starting in at least 5.1, but they recently killed it and we're stuck with the black-ink-on-white-paper paradigm.

Maintaining the familiar within the new is a long-used approach and helps adoption of new technologies. "Horseless carriage" was what people called "cars": they knew what horses were and what a carriage was. "Wireless telegraphy" was early (pre-music/voice) radio, as people knew what telegraphy was and what wires were. The "iPhone"? Well, no, it's not really a phone -- it's a handheld color wireless computer and communicator -- but people knew what cell phones and iPods were, so...

And thus black digital text on white digital background. And, keeping the computer and its keyboard and the overall gestalt familiar, underlining in the fonts. But we don't need it. It's redundant (it means italics!) and I don't think it looks very good. It is time for underlining to end.

Sunday, November 15, 2015

Facebook, Paris, Beirut

A lot has been written about how Facebook activated its "Safety Check" feature for the recent Paris terrorist attack, but not for one the previous day in Beirut. There is some good commentary, from less well known sites to the bigger news sites to personal blogs.

So what are we left with?

Facebook is a part of the global media fabric as much as any other site: it not only acts as a gatekeeper for news but acts within the news environment and within ideas about what constitutes news. Facebook is based in California, and so most of the employees who make up the organization we call Facebook are American: they grew up within a specific media environment which had clear but subtle ideas about what news is. (As an aside, I wonder how many employees at Facebook who are involved in the curation of people's feeds have any background in actual journalism.)

To understand this framing of what is news and what isn't, we need to understand global history and the flow of information, a flow which often parallels economic flows. So yes, we need to understand words that some framings have determined should make us uneasy, such as imperialism and colonialism. (Think about it this way: in the Babar series of children's books, when Africans wear European clothes they are portrayed as good, but when Africans wear non-European, "traditional", or "pre-contact" clothes [I am not sure what the best term is], they are portrayed as backwards -- yes, they are elephants and rhinoceroses, but they are being used as humans in the stories. This has not gone unnoticed.) Often the parts of the world that were subjected to colonialism and imperialism, to empire building and the extraction of goods through forced labor and violence, are now the parts deemed not worthy of news coverage, although it isn't quite so easy and straightforward.

But what we do have is highly problematic, beyond the sadly common lack of coverage of some parts of the world -- the Western news media didn't cover the Beirut attacks very much, and neither did Facebook, and both of these non-reactions are for exactly the same reasons, which can be couched in economic terms but have deeper cultural and historical roots. Facebook most likely doesn't have as many users who are directly connected to Beirut, but it has many more with connections to France. The same is true of their employees. But they are also reacting to what they see in the media, and perhaps to trends they are continually monitoring, live, in the overall Facebook environment.

Like the media, Facebook is essentially bestowing the idea of newsworthiness on some issues and also deciding that some other things are not newsworthy at all. That really is a big problem, as it's clear no one there is qualified to do so. This is also a well-known issue more broadly and is not at all new. This is not to say it's good, it's not at all good, nor is it to say Facebook shouldn't have activated the "Safety Check" feature. I appreciated it, as I have friends in Paris.

There are also some technical issues, beyond deciding which events qualify.

For example, for an earthquake, what if I check in as "safe" after the initial earthquake, and then am killed shortly thereafter by an aftershock? (The same issue holds for other kinds of events, such as terrorist attacks.)

Facebook's page about the Safety Check says it's for natural disasters (as of November 15th, 2015), and does not mention other events such as terrorist attacks, nor how any of these will be selected. Yet it was used for an event that was not a natural disaster.

More broadly, it could be argued that being black, female, or GLBT in America is to live under constant threat (there are many other examples, but I am not qualified to discuss them much, nor can I make an exhaustive list; this is just an example). But, under the same framework that silently suggests Beirut is less coverage-worthy than Paris, these issues are to be kept quiet.

Facebook has taken action in a very contentious area, one where ideology and hegemony are heavily invested in outcomes and how we think about what is worth thinking about. Yes, as we should expect from most gigantic global companies, they did a bad job. As we know, people have been discussing these issues for a long time. These issues are still issues. Now, more people are talking. That's an important step. Steps are how we move forward.

Addendum: There is also the profile picture change to overlay the French flag on your profile photo, which again is not a bad idea; the problem is still which events are worthy of this level of attention, who is deciding, and how these decisions are made. It's the same problem as with big-data algorithms, except here it's people's decision making.


Addendum, part 2: Here is an article from The Verge about why the people at Facebook who make these decisions did so. I personally don't find the official explanation of their selection criteria very satisfying, because it avoids all of the difficult issues most people are talking about. A company like Facebook could address these issues in a much more direct and clear manner.

Monday, July 13, 2015

Music in an MMO!

Oh, this is exciting -- but I don't play each and every MMO (that would be insane), so I didn't know about it.

Cheng, W. (2012). Role-Playing toward a Virtual Musical Democracy in The Lord of the Rings Online. Ethnomusicology, 56(1), 31-62.

There's no abstract, but here's an early paragraph:

In an attempt to honor the rich musical lore of Tolkien’s Middle-earth, Turbine implemented in LOTRO one of the most elaborate player-music systems in any MMORPG to date. This system allows a player to perform both live and pre-recorded tunes that can be heard by other nearby players in the gameworld. A player’s musical performance is visually simulated by avataric motions and strings of colorful notes that float out of a character’s equipped instrument (see Figure 2). Examples of such instruments—each of which sports a different synthesized timbre and a range of three chromatic octaves along the Western twelve-tone scale—include the bagpipes, clarinet, flute, horn, cowbell, drums, harp, lute, and theorbo.
The wiki page seems pretty informative, and there's a website repository for the ABC music notation files.
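For readers who haven't seen it, ABC is a plain-text music notation format, which is part of why player-made tunes are easy to share as simple files. A made-up, minimal illustration of the format (this is my own sketch, not a tune from the LOTRO repository):

```abc
X:1
T:A Hypothetical Tune
M:4/4
L:1/8
K:G
G2 AB c2 BA | G2 AB c4 | d2 ef g2 fe | d2 ef g4 |
```

The header lines give an index (X), title (T), meter (M), default note length (L), and key (K); the letters after that are the notes themselves.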

I love it when people make things and share things, and music is community-based, a form of communication, and very old! The oldest instrument we've found, a bone flute, is about 40,000 years old, and it certainly wasn't the first musical instrument, since flutes aren't that easy to make.

Monday, June 1, 2015

400,000 Years Ago

400,000 years -- we're only at 2,015 in the "common era". So roughly 398,000 BCE. The Great Pyramid of Giza is from around 2,560 BCE, and Stonehenge is from between 3,000 BCE and 2,000 BCE. The cave paintings at Lascaux are from approximately 15,000 BCE. The earliest musical instruments found, bone flutes (bone survives well in the archaeological record), are about 35,000 years old, that is, from 33,000 BCE (although there is some disagreement about some finds and dates).
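The years-ago/BCE conversions above are just subtraction from the year of writing; a quick sketch (using 2015 as the reference year, since that's this post's date):

```python
# Convert "N years ago" to an approximate BCE date, counting back from 2015
# (the year this post was written). There is no year 0, but at these
# magnitudes the off-by-one doesn't matter.
def years_ago_to_bce(years_ago, present_year=2015):
    return years_ago - present_year

print(years_ago_to_bce(400_000))  # 397985, i.e. roughly 398,000 BCE
print(years_ago_to_bce(35_000))   # 32985, i.e. roughly 33,000 BCE
```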

200,000 years ago is when we think the first humans -- Homo sapiens -- evolved.

400,000 years ago is when Homo heidelbergensis -- not us, but our ancestors -- existed. No humans.

And this is a chipped (knapped) stone tool they made, possibly to cut animals apart. And that's me holding it at the British Museum. HOW COOL IS THAT? I'LL TELL YOU: IT'S VERY COOL.


Lost Things

From the amazing British Museum, here is a 4-5 foot tall staff that humanity apparently just lost track of, despite it being perhaps one thousand years old (or at least 700 years old). "Oh that old thing?"

"It was found behind a London solicitor's cupboard in 1850..." BEHIND A CUPBOARD.



Sunday, June 17, 2012

Incredible! "Deer Antler"

I don't even know what to say. Humor? Museum? Learning to read? Awesome. From the National Museum of Ireland.


Saturday, March 17, 2012

Web Boards as Monuments

Fort Greene Park, Brooklyn.
On the issue of community, Conrad (2005) wrote how "[online] learners described the effects of group work, the frequency of sustained discussion, and the permanence of Web-based texts as powerful community builders."

Web-based texts, like archived forums, act as community builders by showing what took place in the past -- and that yes, it took place. In that way, they are like the big stone monuments we use to mark achievements. 


Conrad, D. (2005). Building and maintaining community in cohort-based online learning. Journal of Distance Education, 20(1), 1-20.

Friday, January 28, 2011

Twitter, Facebook, Revolutions, and Spectacle

I think I should call this post, "Twitter, Revolution, and the Narcissism of Internet Commentators."

Internet commentators, such as myself, seem to have a certain fascination with the relationship between the Internet and political revolution. I think it is this: we derive our power from the power and importance of the Internet. If the Internet is powerful enough to cause a revolution, then we are powerful too.

But as we all know, revolutions do not need the Internet. Historically speaking, the Internet is so new that almost no revolutions can claim it (or, the Internet can't claim them). Revolutions need organization and power, and the Internet can help organize quickly and massively (for either revolutions or flash mobs of people who dance in stores). The American Revolution had armies fighting, the French Revolution had the Bastille, the Russian Revolution had the Red Army and the White Army and the murder of the Tsar and his family, and the Chinese communist revolution had the armies of Mao and Chiang Kai-shek. None of these had the Internet.

The recent uprising -- a failed revolution? -- in Iran had the Internet, and cell phones, and smart phones, and current communication technologies that allow people to organize and share messages on a massive scale with unprecedented speed. (The telegraph was the first communication technology to separate communication and transportation, and it was an amazing thing.) But the Internet, with its Twitter and its Facebook, did not bring down the regime there. Burma is still Myanmar, China is still slowly destroying Tibet, and there are other examples one could mention, each with its own history, complexities, disagreements, and more than two sides to each one.

Twitter and Facebook share in a criticism that has been made about television: That we watch it, we feel we are part of it, but we are not, we are in fact at home watching television (and many of us do so from our nice, safe, Western nations, vast distances from the real action, yet here it is in our living room on the screen). Twitter and Facebook, the technologies of the moment, are like this but to even more of an extent: We can see the tweets from people who we believe are actually there, on the ground, surrounded by riot police -- as long as they tweet in English, that is, and English is never the language of the nation in question, nor is it ever clear how educated one has to be, or what social class one belongs to, to tweet in English when it's not one's native language. Clearly such people are not tweeting for their local audience.

That is not a criticism, however, but it is occasionally overlooked. Knowing your audience is always important, as is knowing about the message sender if you are the audience.

This is somewhat akin to slacktivism, and I am sure someone has written before about this feeling of immediacy we can get from tweets. I think these communication technologies are important, but their endless hyping stems from a misunderstanding: The current embodiment of digital communication technologies allows us to communicate widely and freely, which is an important change from past communication technologies. But it is not about Twitter, and it is not about Facebook; they are merely the names of the embodiments of the forms we are using today. This is instead a more enduring story, one of people, and how these technologies let us do what we have always done, that is, connect, be it to plan a meeting for coffee or an attempt at revolution.

Edit: This isn't to say we don't care about our fellow human beings, or that we don't want to see what is happening to them. Often we do.


See also this article (the second half mostly) at the New York Times by Scott Shane.

And see this blog post by Rasmus Kleis Nielsen, a fellow academic.

Friday, December 17, 2010

When There Is No Digital Sediment

I may have been thinking about the ancient peoples of Europe, or even Neanderthals, at the time, but I realized one thing some virtual worlds lack today is the archaeological record found in the layers of sediment put down by time. There is no digital sediment there.


I don't believe this was always so, particularly with MUDs. In the non-virtual-world space, with Google and the Wayback Machine, there is plenty of digital sediment. Wikipedia is all digital sediment. In MMOs, I don't think there is any, although I don't know every MMO in this manner. In EverQuest II, if you stop paying for your house or guild hall, it's off-limits. Any "history" was put down by the game designers and isn't the same as the history of the digitally lived experience, even when that "history" refers to the original EverQuest.

In Second Life, you have to buy space. I am not entirely sure what happens with space that is given up -- I'm under the impression it gets sold, either to another user or to a land broker, but either way the digital objects are usually wiped and something new is put in, with no layering on the past, just an erasing. Although this makes digital building easier -- and I am a bit hesitant to call it "building", since it has very little to do with building in the physical sense -- it certainly does away with one of the usual conditions of the real world. I imagine if the owner of a zone doesn't want it, and can't sell it, it gets erased. Eventually Second Life will have to shrink, but I don't follow Second Life currently (it's creepy), and I doubt the Lindens will trumpet the loss of "land" like they trumpeted when all the corporations and embassies moved in (when the corporations moved out, it wasn't the Lindens talking about it).

MUDs, since many of them were non-commercial and run on some university server somewhere, just built up stuff over time. There weren't exactly layers, but there were additions and additions and more additions.

Some games use history as a feature. Dwarf Fortress is one; once you dig away the rock in the mountain, it's gone. Sure you can essentially fill it back in, but not with original stone, you have to make floors and walls instead. The history of the space is often there. SimCity, from what I recall, had some features that were easy to undo (like zoning) but other ones that weren't. Overall, though, you could re-build anything in-game, erasing everything before it.

But if we want to deal with virtual worlds as real-ish worlds, or as real worlds but in digital form, we need to remember what qualities digital worlds don't have. History is vitally important to us, but often the digital sediment of the past is missing. It's not presented as a bug, it's presented as a feature.

Saturday, December 4, 2010

Names and History

A nice point by Paul Graham about what we call touchscreen devices like iPhones, which I saw on boingboing (I thought, but I can't find it, so maybe not) and Daring Fireball:

The only reason we even consider calling them “mobile devices” is that the iPhone preceded the iPad. If the iPad had come first, we wouldn’t think of the iPhone as a phone; we’d think of it as a tablet small enough to hold up to your ear.
This is standard human behavior; we've done it before. We present new things in terms of the more familiar, then-current things. Horses led to horseless carriages, from which we dropped the "horseless" and the "-riage" part of carriage to get just "car" (I believe), which we still drive. There was also the iron horse (the locomotive), and the wireless telegraph (early radio, before voice was used).

Tuesday, May 4, 2010

People and Change (Time) Part 2

Connecting with my earlier post on how many people in Detroit resisted accepting the standardization of time (and the move away from solar time), we see the same thing with the change from the Julian calendar (named after Julius Caesar) to the Gregorian calendar, which we use currently.


From Paul Strathern's The Medici, Godfathers of the Renaissance, pp. 360-361:

By the turn of the seventeenth century the Renaissance was beginning to make itself felt in a range of increasingly disparate fields. The times were changing, even in the most literal sense: when it was noticed that the seasons were beginning to drift away from their customary positions in the ancient calendar, Pope Gregory XIII abandoned the ancient Julian calendar dating from Julius Caesar in 46 BC, and in 1582 introduced a new Gregorian calendar, at a stroke advancing the date by ten days. Yet many remained highly suspicious of such transformations, and as the new calendar was introduced over the years throughout Europe, it provoked riots, with indignant mobs demanding back the ten days that had been robbed from their lives.
Riots! Wow.
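As a back-of-envelope check on where those ten days came from: the Julian year averages 365.25 days while the actual solar (tropical) year is about 365.2422 days, and the 1582 reform reckoned the drift from the Council of Nicaea in 325 CE (which had fixed the rules for Easter), not from Caesar's reform in 46 BC. A rough sketch:

```python
# Why the 1582 reform dropped ten days: the Julian calendar gains about
# 0.0078 days per year on the seasons, and the drift was measured from
# the Council of Nicaea (325 CE), not from 46 BC.
julian_year = 365.25       # average Julian calendar year, in days
tropical_year = 365.2422   # approximate solar (tropical) year, in days
drift_per_year = julian_year - tropical_year
drift_days = (1582 - 325) * drift_per_year
print(round(drift_days, 1))  # ~9.8, rounded up to the ten days dropped
```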

Thursday, December 31, 2009

Tech and People

Aha! This is what I have been saying! Sadly I do not quite get the same coverage as an O'Reilly person. Via boingboing, from Sarah Milstein:
Our human patterns are surprisingly consistent, and technology evolves to meet us...
Technology changes, people don't.

Monday, November 2, 2009

100 Year Old Griefing

Susan Douglas (one of my PhD advisors), in her 1987 book Inventing American Broadcasting, detailed a 1907 New York Times story about a young man who used wireless. Although wireless was the precursor to modern radio, it was not at all like today's radio. There were no stations playing music; it was Morse code, and it was all individuals sending and receiving messages. These individuals could be some young guy in New Jersey, or the radio man on a ship, or a commercial "station" of sorts (such as a newspaper contacting ships at sea for news), or the navy.

For the most part (probably exclusively), it was young and technologically savvy men. But it was also anonymous, since there was no automatic way to identify people. And when you have a communication technology and anonymity, you have griefing. Douglas wrote how, in about 1910, “deliberate interference... began to get out of control, and to the military, in particular, it ceased to be in any way innocent or amusing” (p. 207). Congestion of the airwaves, and general interference, was increasing, but so was “malicious interference” (p. 208).

Some amateurs deliberately sent false or obscene messages, especially to the navy. The temptation to indulge in such practical joking was enhanced by the fact that detection was virtually impossible. Amateurs would pretend to be military officials or commercial operators, and they dispatched ships on all sorts of fabricated missions. Navy operators would receive emergency messages about a ship that was sinking off the coast. After hours of searching in vain, the navy would hear the truth: the “foundering” ship had just arrived safely in port. (p. 208)

Sending navy operators “profane messages” was something else that the amateurs did. The navy, trying to assert some control over the airwaves, would issue “statements about the grave danger posed by the amateurs, and cited many instances of unpatriotic interference.” (p. 210)

Much like today, anonymity played a large part in people’s behavior. “The anonymity made possible by wireless had a leveling effect on the status and power of naval officials: in the airwaves, rank was irrelevant; only technical strength mattered.” (p. 210)

Saturday, August 29, 2009

30k Years Ago...

As you can see from the lonely "History" tag, this has nothing to do with technology. Instead, it is about a 30,000 or so year old bear skull that some pre-historic person placed on a rock in a cave and left there. And there it sat. For a few years. And a few more. A century. One thousand years. Another thousand years. Ten thousand years. More! Until we modern humans found it, in Chauvet cave in France. (Click on "Visit the Cave", then in the upper right of the map find the green dot that is "The Chamber of the Skull". Click it.)


This is pretty cool. Why? Don't we find old bones, well, not all the time, but, museums are full of them. Yes. But, usually they are in the ground, surrounded by datable strata, or maybe a tar pit, or, like a mammoth, frozen in ice (or like The Ice Man). Granted the bear skull was underground, in the sense that the cave is under the ground, but it was not in the ground, it was sitting on a rock the entire time. Yes we find rock paintings, but those are painted onto rocks. You can't move them.

Typically when we find things that old, they are not just sitting there. King Tut's tomb was amazing because it was fairly, but not completely, undisturbed, and is a little over 3,300 years old (The Ice Man, at roughly 5,300 years old, is about 2,000 years older). Stonehenge is about 4,500 years old. The Egyptian pyramids, which are also stone structures that have been out in the open, are about the same age. But our little skull friend was already ancient when all of those were built.

Sunday, August 16, 2009

People and Change (Time)

From Grandin's Fordlandia, p. 224.

[Ford] was twenty-two when, in 1885, most of Detroit refused to obey a municipal ordinance to promote "the unification of time," as the campaign to get the United States to accept the Greenwich meridian as the universal standard was called. "Considerable confusion" prevailed, according to the Chicago Daily Tribune, as Detroit "showed her usual conservatism in refusing to adopt Standard Time." It took more than two decades to get the city to fully "abandon solar time" and set its clocks back twenty-eight minutes and fifty-one seconds to harmonize with Chicago and the rest of the Midwest (the city would switch to eastern standard time in 1915, both to have more sunlight hours and to synchronize the city's factories with New York banks).
Time is relative! (Yes I mean time of day, not the passing of, but that's relative too.)
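As a sanity check on that oddly precise figure, using the standard rule of 4 minutes of solar time per degree of longitude: setting clocks back 28 minutes 51 seconds to match the Central (90°W) meridian implies a local-solar-time meridian of about 82.8°W, just east of downtown Detroit (roughly 83°W), which is consistent with the city having been on local solar time. A sketch:

```python
# Solar time runs 4 minutes per degree of longitude. Setting Detroit's
# clocks back 28m51s to match the Central (90 deg W) standard implies a
# local-solar-time meridian of about 82.8 deg W -- just east of downtown
# Detroit (~83 deg W), roughly consistent with the quoted figure.
offset_minutes = 28 + 51 / 60                  # 28m51s expressed in minutes
implied_longitude = 90 - offset_minutes / 4    # degrees west
print(round(implied_longitude, 2))  # 82.79
```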

Monday, June 8, 2009

The Information Future's Past

Recently, Paul Otlet has gotten some press as a forgotten information-future pioneer (from 1934, no less); however, I just discovered that he actually wasn't that forgotten. Going through some articles I had used in my dissertation, I found Otlet referenced in a 1994 JASIS article by Donald Case, who in turn references a 1992 article by Buckland. 

Case also references a 1964 article in The Atlantic by Martin Greenberger that you simply must read. And, even better, The Atlantic has made it available online. How smart is that? (Very.) A snippet -- and remember, this is 1964:
The range of application of the information utility extends well beyond the few possibilities that have been sketched. It includes medical-information systems for hospitals and clinics, centralized traffic control for cities and highways, catalogue shopping from a convenience terminal at home, automatic libraries linked to home and office, integrated management-control systems for companies and factories, teaching consoles in the classroom, research consoles in the laboratory, design consoles in the engineering firm, editing consoles in the publishing office, computerized communities.
Wow! Spot-on. Awesome. Apparently this piece did indeed influence later information society embodiments.

Friday, April 3, 2009

Blast From The Past: 1997, WebTV

12 years certainly does allow for some perspective, although people were insane in 1997 before the dot-con bubble burst (no, "con" was not a typo). 


From Microsoft Took WebTV Risk, Despite Loss, by Steve Lohr, May 5, 1997.
WebTV's under-the-hood technology was probably the real lure for Microsoft, says Roger McNamee of Integral Capital Partners, an investment firm. Imagine the day when HDTV, he adds, becomes affordable and popular, with Microsoft charging the manufacturers a license fee of, say, $50 a set for the software that brings the Internet to those souped-up sets.
Well, HDTV is affordable and popular. Microsoft has the Xbox 360... there's Sony's PS3, Apple's Apple TV approach, and we have Boxee, TiVo, DVRs, even open-source DVRs (MythTV). I could go on about other parts of the ecosystem (like Hulu). The internet is not currently the best delivery system for HDTV (digital cable, satellite, Blu-ray...), even though I get both my internet and TV over the same cable.

I hate when analysts say dumb things.