April 03, 2008
Live Blogging from KSL GIS Symposium--Breakout Session: Andrew Curtis on Yellow Fever
Andrew Curtis, GIS Research Laboratory, Dept. of Geography, University of Southern California
"Using GIS to Reveal Spatial Patterns in the 1878 Yellow Fever Epidemic of New Orleans."
Challenges for health analyses:
Lack of data
Lack of dynamic data (most events vary in both space and time)
He described the GPS/video techniques used for capturing data post-Katrina in the Holy Cross neighborhood of New Orleans.
Showed examples of medical cartography.
Showed contemporary maps from the 1897 yellow fever epidemic illustrating mosquito distribution and cases of yellow fever in New Orleans.
Why study epidemics from the past using GIS?
1. Develop our understanding of the event itself
2. Use the data to improve our methods of analysis/visualization
3. Look for insights into the spread of the disease.
The disease entered New Orleans almost annually; N.O. was a major trading hub. Surviving infection conferred immunity. The 1878 epidemic was the most geographically devastating and led to a better quarantine system.
Discussion of the development of GIS maps useful for studying the epidemic.
[There was a simultaneous breakout session by Daniel Janies, PhD, Dept. of Biomedical Informatics, Ohio State University, "Genomic and Geographic Analysis of the Evolution and Spread of Infectious Disease"]
Live Blogging from KSL GIS Symposium--Breakout session: Dave Wagner on Anthrax, Plague and Tularemia
Dave Wagner, Center for Microbial Genetics and Genomics, Northern Arizona University
"Using GIS to Understand the Ecology, Dispersal and Evolutionary History of Diseases: Examples using Anthrax, Plague and Tularemia"
Research focus: potential bioterrorism agents (Anthrax, Plague and Tularemia, i.e., Category A Select Agents)
The dogma is that anthrax was introduced to North and South America by colonists from Europe. Wagner's group mapped the genomics, which indicates that anthrax more likely arrived with humans crossing the land bridge from Asia to Alaska.
Further discussion of the spread of plague through prairie-dog colonies in the American West.
[There was a simultaneous breakout session by Nina Lam, Department of Environmental Studies, Louisiana State University, "Reducing Uncertainties in Health Risk Assessment through GIS and Spatial Analysis."]
Live Blogging from KSL GIS Symposium--Breakout session--Peter Tuckel on the Influenza Pandemic of 1918
Peter Tuckel, from the Dept. of Sociology at Hunter College, CUNY.
"The Diffusion of the Influenza Pandemic of 1918 in Hartford, Connecticut"
Hartford was hard hit by the pandemic and had a varied population. (And it was Prof. Tuckel's birthplace.)
Prof. Tuckel explained the methodology and outcomes of the study.
Analysis of death certificates (place of death--home or hospital; age, sex, race, marital status; national origin; undertaker; embalmed or not) of everyone who perished from the disease between 9/1/1918 and 12/31/1918.
A digital street-level map of Hartford during the period of the pandemic: he used a 1990 digital street map, altered to reflect the situation in 1918, not only adding streets but also address ranges. (The address ranges came from a 1918 city directory.) The addresses were essential for geocoding the places where people died. There was an analysis of the housing stock (single-family dwellings as opposed to multi-unit buildings). He created "sub-ward" profiles by ethnic group. Southern/Eastern European ethnic groups had higher death rates; the native-born had a higher death rate if they lived in an immigrant neighborhood.
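The address-range geocoding described here boils down to linear interpolation along a street segment. A minimal sketch (purely illustrative, not Prof. Tuckel's actual tooling; the house numbers and coordinates are invented):

```python
def geocode_address(house_no, low, high, seg_start, seg_end):
    """Place a street address proportionally along its street segment,
    given the segment's directory address range (low-high) and the
    coordinates of the segment's two endpoints."""
    if high == low:
        frac = 0.5  # degenerate range: drop the point mid-segment
    else:
        frac = (house_no - low) / (high - low)
    x = seg_start[0] + frac * (seg_end[0] - seg_start[0])
    y = seg_start[1] + frac * (seg_end[1] - seg_start[1])
    return (x, y)

# A hypothetical address 150 on a segment whose 1918 directory
# range is 100-200 lands halfway along the segment:
print(geocode_address(150, 100, 200, (0.0, 0.0), (10.0, 10.0)))  # (5.0, 5.0)
```

Real street-network geocoders also account for odd/even address parity and an offset from the street centerline, but the proportional placement above is the core idea.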
For each victim they assigned a numeric code based on the date of death (Sept. 1 = 1; Dec. 31 = 122). By assigning these values he was able to map the progress of the pandemic. The death rate was highest where the disease struck earliest, but there it also had the shortest duration, and it manifested itself in immigrant neighborhoods. "It decimated everyone in a short amount of time in those neighborhoods."
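The date-to-code assignment is a simple ordinal mapping; a one-liner in Python (my sketch, not the study's actual software) reproduces it, including the Dec. 31 = 122 endpoint:

```python
from datetime import date

START = date(1918, 9, 1)

def day_code(death_date):
    """Sequential code for a 1918 death date: Sept. 1 -> 1."""
    return (death_date - START).days + 1

print(day_code(date(1918, 9, 1)))    # 1
print(day_code(date(1918, 12, 31)))  # 122
```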
Instead of viewing the epidemic as a solitary event, one can better understand it as a set of somewhat discrete events or "mini-epidemics." Native-born or "older immigrants" were less susceptible. It ran its course much more rapidly in congested poor areas than in more sparsely populated, affluent areas.
[There was a simultaneous session presented by Shubhayu Saha, Dept. of Forestry and Environmental Resources, North Carolina State University: "Minerals, Forest and Health: Does Resource Extraction undermine Human Development?"]
Live Blogging from KSL GIS Symposium--Lunch break and poster sessions
We're now on lunch break until 1:15, with poster sessions spread around the 2nd floor of KSL. Although lunch is limited to symposium registrants, the posters are available for all to view.
Back for the afternoon, which consists of several break-out sessions.
Technorati Tags: GIS
Live Blogging from KSL GIS Symposium--Plenary Session #2 Uriel Kitron
Uriel Kitron, Dept. of Environmental Studies, Emory University
"West Nile Virus in Chicago: Considering the Past, Understanding the Present, Predicting the Future"
Prof. Kitron pointed out the importance of libraries as centers for providing geospatial data.
He spoke of the role of GIS, remote sensing and spatial analysis in vector-borne disease (VBD) research. Scale is of considerable importance: spatial scales (village/town, continental), temporal scales (seasons, years, decades), and the resolution of those scales can be considered simultaneously as well as consecutively.
West Nile Virus appeared in NYC in 1999 (from the Old World); in 2002 it appeared in Chicago and its surroundings. The virus has moved very quickly across the United States.
Prof. Kitron explained in detail factors related to the spread of West Nile Virus in the Chicago area and their relationships to earlier outbreaks of other diseases in some of the same geographical areas. One factor seems to be the many undocumented storm drains filled with water and organic waste, making an excellent breeding place for the mosquito larvae that spread WNV. If the drains are flushed regularly by frequent rains, the larvae are likely to be washed away and the problem diminishes.
Prof. Kitron's future research will investigate the fundamental ecological processes that drive fine-scale variations in WNV transmission, with a focus on fine-scale spatial relationships.
Live Blogging from KSL GIS Symposium--Plenary Session #1 Charles H. King
Joseph Koonce, CWRU biology professor, introduced Charles H. King, MD, from the Center for Global Health and Diseases and Dept. of Epidemiology & Biostatistics, at Case Western Reserve University.
"Microscope to Macroscope - Using GIS to Understand Environmental Complexity in Disease Causation."
* Background on the philosophy of medical science
* Resistance to GIS
* Complexity and theory of environmental analysis
* GIS and the new practice of "eco-social epidemiology."
Since the 1700s the microscope has been the rationalist/positivist tool of choice; reductionistic; germ theory was a major breakthrough; now molecular medicine.
What is wrong with environmental studies in medicine: it's messy; it's too complicated; it's difficult to isolate cause and effect; it tells us things we don't want to hear. Complexity reigns in our political world. Genetics, exposure and environment all relate to infection and disease.
Randomized controlled trials do not reflect the real world: "Why don't patients get better on 'proven' regimens?" There is "hidden stratification" in samples that ends up producing unpredictable results.
Complex systems organize themselves into predictable but chaotic-appearing patterns.
What are our health research goals? Explanatory; Predictive (past performance does not necessarily predict future performance)
He discussed his own research about transmission of a parasite spread through water and snails in Africa. He described his use of GIS for analysis/data mining. Use of remote sensing and satellite imaging for mapping, creation of spatial data. Data must be confirmed on the ground with GPS data. Data is correlative, not causative.
Other dimensions such as poverty and socioeconomic factors play a role. We cannot ignore the context.
Why do we do this research? We want to be able to use all of these factors to analyze how they combine to foster disease.
Opportunities for GIS:
* New impetus for ecological research
* Comprehensive multi-scale picture of local/regional/global epidemiology
* Consideration of temporal changes
* Integration of molecular data with environmental data.
Spatial data, concerns and limitations:
* massive amounts of data,
* but a paucity of accurate epidemiological data
* lack of data is readily masked in maps
* the meaning of area boundaries is important--show what you don't know
* there are limits to the use of spatial autocorrelation and interpolation
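As an aside for readers unfamiliar with the term, spatial autocorrelation is commonly measured with global Moran's I. A bare-bones version (my own illustration, not from the talk; the values and weights are invented) looks like:

```python
def morans_i(values, weights):
    """Global Moran's I. values[i] is the observation at location i;
    weights[i][j] is the spatial weight between locations i and j."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four locations in a line, adjacent pairs as neighbors; a clustered
# pattern (high next to high, low next to low) gives a positive I.
vals = [10.0, 9.0, 1.0, 2.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i(vals, w))
```

Values near +1 indicate clustering, near 0 randomness, and negative values dispersion, which is exactly why sparse or missing data can quietly undermine the statistic.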
Q. How do you obtain data, and how much data is available?
A. "It's all over the place." Rainfall data for the city is discarded every day; some weather data is retained, but it may not be what is needed. It may be necessary to set up your own sensors, as Prof. King did in Africa.
Q. Comment: a focus on complex ecology is important. Multidisciplinary research (ecology + disease) is a problem for NSF and NIH--there is one joint panel to handle such applications.
Q. Can mapping move to prediction?
A. Some generalities can be made for some cases, but in most instances there is not enough data to make predictions (e.g. West Nile virus).
Live Blogging from KSL GIS Symposium--Clifford Lynch keynote address
Cliff Lynch, Executive Director of the Coalition for Networked Information, spoke about the notion of cyberinfrastructure as related to scientific inquiry:
high-performance computing; sensors connected to the network; very large data sets; virtual organizations. These concepts are known collectively as e-science, especially in Europe; in the U.S. the same concepts are known (in Lynch's view, somewhat perversely) as cyberinfrastructure. The NSF is the guiding body for these concepts through its Office of Cyberinfrastructure, headed by Dan Atkins.
He discussed simulation as a fundamental tool for science, one which some argue should be taught broadly to undergraduate students. Examples: disaster planning based on certain characteristics (time of day, spring break, on a bridge); simulating early agrarian societies.
Sensors: use of very tiny sensors--"smart dust"--that can be "spread around" to gather data. Ecologists and environmentalists are using "dumb sensors" that can sense only a few kinds of phenomena, overlaid with a system of "mobile sensors" that can move to a place where more sophisticated data gathering is indicated. Social elements: closed-circuit TVs; highway monitoring; cell phones that know where you are and are now starting to carry other kinds of sensors--with certain kinds of sensors built in, it would be easy to build a ubiquitous national sensing network. Much social, commercial and societal activity has moved to the Internet, where it can now be monitored and tracked; this creates an enormous social sensor network that we have not had before. He described his impression that the major software firms (Google, Yahoo, Microsoft) are very protective of the privacy of their users for good business reasons: if they let the data be "non-anonymized" they would not be able to gather it.
Data: It is a fundamental cornerstone of e-science. Not just preservation, but data curation--the notion that you are keeping data not just for altruistic purposes or for the sake of creating archives, but to enable new scholarship, repurposing, and the creation of downstream new knowledge. Examples: meta-analysis across separate data sources, especially diverse sources removed from their original purposes. Preservation of data costs money, and we don't necessarily want to preserve everything. There is still a strong bias in the sciences to preserve as little data as possible, creating a "nightmare scenario" in which future funding is required to preserve old data. We are just beginning to have a language to discuss preservation of data: "data curation." "Data scientists" are a new breed of person starting to come out of schools of information science. Most projects will not be able to support large-scale data curation staff, and there are some scientific areas in which it seems an unsolvable long-term problem (e.g. high-energy physics). Data also needs to be collected in some sort of context. Once it is packaged, some entity needs to take responsibility for managing the data in the long term. It is part of fundamental scientific results:
*disciplinary repository (e.g. molecular biology) with norms for collection and sharing; some agencies are now demanding pre-publication data sharing; who pays for the repository
*journal publishers: "give us the whole package": article, data, computer programs. Sometimes this is in reaction to academic fraud cases. The journals are quite vague about who has the long term responsibility for these "supplemental materials." How much supplemental data can you give them?
*universities themselves, which host the research, especially through the university library. There are serious financial issues for the university/library that undertakes these efforts, and it is a big expansion of the library's role. There is also a need to deal with duplication of effort by spreading areas of expertise for academic disciplines among fewer institutions.
Lynch pointed out that his use of "e-science" can more correctly be termed "e-research," since the same techniques for research can be used in the humanities and social sciences. The humanities are beginning to generate large data sets that will also need to be preserved and curated.
QUESTIONS FROM THE AUDIENCE:
Q. Product liability for information? e.g. faulty sensors and data that causes catastrophe. Pharmaceutical trials in which data may have been suppressed.
A. Lynch sees this as a serious problem that will get worse, especially given corporate lawyers who want data destroyed as soon as possible, because old data is "pure liability." What constitutes the material that is used for peer review? (The article? The data? The computer programs?)
Q. Comment on data storage.
A. For most data, raw costs of storing are not very significant. Human-produced data (writing, speaking, video) is now "not that big a deal." Getting rid of data will be done for other social purposes, not for costs. Historians now do not have enough hours to review the entire human record, so data mining becomes essential.
Q. Will there be improvement of metadata?
A. Deposited data should be streamlined as much as possible, with the metadata managed in other ways.
Live Blogging from KSL GIS Symposium
I am reporting live today from the Kelvin Smith Library biennial symposium GIS Technology: Sustaining the Future, Understanding the Past. The symposium, funded by an anonymous donor, is taking place today on the second floor of KSL. The topic this year relates to the use of GIS in studying the spread of disease, pandemics, and the effects of environmental change on disease.
Lynn Singer, Deputy Provost of Case Western Reserve University, welcomed the 100 attendees, pointing out the ongoing nature of the university's pandemic planning efforts.
Joanne Eustis, University Librarian, thanked the planning committee for the symposium and introduced the keynote speaker, Clifford Lynch, Executive Director of the Coalition for Networked Information. Summary of his talk to come.
December 28, 2007
Ask and you shall receive: Warner Music drops DRM
Only yesterday I was complaining that Warner Music was still using Digital Rights Management protections on its music downloads sold through iTunes. Today's New York Times has a report that Warner will begin to sell DRM-free downloads on Amazon.com's mp3 download site. Great news! Now it's up to Sony/BMG to step up to the plate.
December 27, 2007
Chandos to sell DRM-free mp3s on Amazon.com
I was browsing through the growing mp3 download site on amazon.com the other day, looking at the upcoming new releases, and I discovered that beginning on January 8th, the entire Chandos catalog will be available as DRM-free downloads on Amazon. For several months Chandos has been selling songs on iTunes with DRM, but for more than a year it has sold them on its own download site without DRM. The only problem with that site is that it is U.K.-based and charges only in GBP, so with the weak dollar U.S. buyers pay a rather stiff premium, plus a currency-exchange fee on the credit card. Chandos has a vast collection of all kinds of (classical) music, but it is especially strong in British music.
The download scene has now reached the point where if I can buy something DRM-free from another source than iTunes, that's what I'm doing. I'm still waiting for Warner Music (with its Nonesuch label) and Sony BMG to get with the program, especially for classical music.
November 29, 2007
LC to begin converting images to JPEG 2000
An article at Libraryjournal.com describes a new collaboration between the Library of Congress and Xerox to convert as many as a million digital images, photographs and maps from the TIFF format to the newer JPEG 2000 format. The goal is to create a "leaner, faster" digital collection. Xerox will create a set of guidelines for converting TIFF files to JPEG 2000, which will then be turned over to LC and to the National Digital Information Infrastructure and Preservation Program (NDIIPP).
Digital Case, the digital library and repository for Case Western Reserve University, uses JPEG 2000 for delivery of images stored in Digital Case. In a fine case of hedging our bets, we continue to store archival images in TIFF format. The outcome of the LC project could help us determine whether the time is right for future projects to create the original images in JPEG 2000 rather than in TIFF. Like TIFF, JPEG 2000 supports lossless compression (it offers a lossy mode as well).
November 28, 2007
NPR story about online courses
NPR's "Morning Edition" show this morning had a feature about online courses and how faculty cope with the differences between teaching "live in person" and online. Not a word, however, about how schools offering online courses--in this case the University of Illinois-Springfield--make library resources available to their far-flung students.
May 23, 2007
Ann Vander Schrier participates in "Taking Science to the Streets"
KSL's GIS Manager Ann Vander Schrier will participate in "Taking Science to the Streets" on Monday, June 11, from 6-8 PM at the Great Lakes Brewery. She will be presenting with Prof. Mark Salling from Cleveland State University on the topic of "Google Earth and the Traveling Salesman: The Science of Geographic Information Systems (GIS)". There are more details here.
April 09, 2007
Adobe Day at KSL on Wednesday, April 11, co-sponsored by KSL and ITS
Please join the Adobe Education Team on Wednesday 4/11 in the Kelvin Smith Library Dampeer Room for the following sessions:
Adobe Acrobat Professional
8.30-10am and 3.30-5pm
Adobe Acrobat Professional-Communicate and collaborate with the essential PDF solution-
The Adobe Education team will provide an overview of how Acrobat Professional software enables education professionals to reliably create, combine, and control Adobe PDF documents for easy, more secure distribution, collaboration, and data collection.
Why Acrobat Professional?
Easily create Adobe PDF documents from Microsoft Office, Outlook, Internet Explorer, Project, Visio, Access, Publisher, AutoCAD®, Lotus Notes, or any application that prints.
Combine documents, drawings, and rich media content into a single, polished Adobe PDF document. Optimize file size, and arrange files in any order regardless of file type, dimensions, or orientation.
Enable users of Adobe Reader® software (version 7.0 or 8) to participate in shared reviews. Use the Start Meeting button to collaborate in real-time with the new Adobe Acrobat Connect line of products.
Easily collect and distribute forms, combine collected forms into a searchable, sortable PDF package, and export collected data into a spreadsheet. (Windows® only)
Control access to and use of Adobe PDF documents, assign digital rights, and maintain document integrity.
ADOBE CREATIVE SUITE 3
10.30AM -12PM AND 1.30-3PM
The Adobe Education Team will be providing an introduction on the Adobe Creative Suite 3 solution. Adobe® Creative Suite® 3 software combines shared productivity features such as visual asset management and access to useful online services with essential creative tools that let you design content for print, the web, film and video, and mobile devices.
The Adobe® Creative Suite® 3 family offers you choice — in the combination of creative tools you master, the design disciplines you explore, and the richness and scope of content you create. This revolutionary new release includes six editions, each combining tightly integrated, industry-leading components that enable you to handle virtually any creative task.
Together these six editions of Creative Suite 3 address virtually every creative discipline and empower you to work more efficiently with your creative team; collaborate more closely with developers to produce engaging experiences; and serve your clients, your business, and your creative vision more easily and effectively than ever before.
Adobe Creative Suite 3 Design Premium delivers a dream toolkit for print, web, interactive, and mobile design.
Adobe Creative Suite 3 Design Standard focuses on professional print design.
Adobe Creative Suite 3 Web Premium combines the best-of-the-best web design and development tools.
Adobe Creative Suite 3 Web Standard serves the professional web developer.
Adobe Creative Suite 3 Production Premium is a complete post-production solution for video professionals.
Adobe Creative Suite 3 Master Collection enables you to design across media — print, web, interactive, mobile, video, and film — in the most comprehensive creative environment ever produced.
August 31, 2006
Google Books now is allowing free downloads of public domain books
Yesterday Google announced that it is now possible to download full-text PDF versions of books in the public domain (i.e., generally published prior to 1923) from its site. The Boston Globe has a good introductory article to the new service. (Titles still under copyright have been scanned; however, Google only makes available small "snippets" of the text, with an invitation to buy the book from several online retailers.)
I tried it out yesterday. The books are image-based PDFs and do not have OCR full text attached--therefore it is impossible to do full-text searching in the downloaded file. The files are relatively small. (Lewis Carroll's Through the Looking Glass was about 4.7 MB, so it was very speedy over Case's network.) It is possible to do full-text searching on Google's site. Since I have access to Acrobat Professional, I tried running OCR on the copy that I downloaded. It appears that Google has down-sampled the images used to create the downloadable PDFs to a quite low resolution in order to make the files small enough for convenient download. The result is that Acrobat's built-in OCR does not work well: only about half of the text, or less, could be OCR'd, which makes it pretty useless for searching, since you would never know whether a search term was really there or not. The other issue with the PDF and OCR is that since these scans are taken from library copies, all with a certain "patina of use," there are many "non-print" artifacts in the scanned images. (I especially liked the one that appeared to be someone's crayon highlighting in the text.) There are also various readers' marginal notes. All these artifacts confuse the OCR.
So if you're looking for an old book just for reading online or perhaps printing on your own printer, the Google Books PDFs may be for you. If what you need is searchable text of public domain books, you're better off sticking with the text-only versions at Project Gutenberg. Still, Google's project is a triumph of making content freely available. It also frees other libraries to spend precious digitization resources on unique materials that will add to the world digital library.
March 03, 2006
Ageism in technology? Yes. For good reason? Hmmm.... maybe.
The March 2006 issue of Wired magazine has an article about Sky Dayton's new venture to bring state-of-the-art mobile phone technology to the U.S. by means of a mobile virtual network operator (MVNO) called Helio. Sky Dayton, you'll recall, is the guy who founded Earthlink. He's now the ripe old age of 34, quite ancient in technology terms. An MVNO is a company that sells mobile phones and service plans but doesn't own any network infrastructure; rather, it leases infrastructure and capacity from companies that do. Examples of MVNOs are Virgin Mobile and Mobile ESPN; even the bottom-of-the-rack 7-Eleven wireless plans are MVNOs.
Dayton is annoyed that cell phone technology in the United States is a backwater. It is only necessary to look at Korea and Japan and the Scandinavian countries to find much more imaginative use of high-speed cellular technology. Helio's goal is to bring that technology to the U.S. by teaming up with SK Telecom, Korea's leading wireless carrier. They will be buying network capacity from Verizon and Sprint. They are also setting up a partnership with myspace.com.
All this was interesting in and of itself, but what struck me was the description of the marketing research that the Helio consultants have done to drill in on their core potential customers--18-to-32-year-olds with money to burn. They broke the 61 million people in that age group into eight groups, five of which would not be interested in the Helio products, but three of which would be: the "spoil me," the "see me," and the "feed me" crowds (read the article for the whole explanation).
I have to say that I found it rather distasteful, all the while knowing that this is exactly how companies figure out how to make money. The clear implication of this planning is that persons of my generation (baby boomers) would have no interest in a wireless phone that can receive video news or TV or streamed music. (I find it more than a little odd, since I probably have more disposable income than many young professionals who still have student loans to pay off.) It is true that I am not interested in music videos from the latest bands, nor in the offerings of myspace.com. But maybe I am interested in video newsfeeds from CNN and in gather.com, the new web site sponsored in part by NPR for the intellectual crowd.
For tech companies there will always be an unending supply of 18-32-year-olds. But what happens when the current crop ages out of the targeted marketing demographic? Can we expect companies to abandon them, or will we start to see a broader range of services offered to a wider demographic over time? I may be an antique by technology-industry standards, but that doesn't mean I'm not interested or that I don't have money to spend.
February 24, 2006
Happy birthday, Steve
Today is Steve Jobs's 51st birthday. People who know me know that I am a big fan of Apple Computer products. (My first Mac was the original 128K model in 1984, and I've owned most of the models in between--even some of the dogs of the late '80s. These days I most often use a 12" G4 PowerBook, and I have three various iPods in rotation.) But yesterday Apple announced a milestone that may eclipse all of Steve Jobs's accomplishments to date: the sale of the billionth legally downloaded song from the iTunes store. (The buyer of the billionth song will receive a 20" iMac, 10 iPods, a $10,000 iTunes store gift card, and--with a kind of Donald-Trump-ish "class"--a scholarship established in honor of the event at the Juilliard School in New York.)
Apple is the company that accomplished what many said could not be done--getting people to pay willingly to download songs legally. The simplicity of the $.99-per-song scheme and the relatively liberal rights for consumers to use the songs on multiple computers and burn them to CDs have been part of the success. It is, of course, all a ploy to get people to buy iPods, since Apple's per-song profit margin on downloads is minuscule. But Apple's success in the market also sets the bar high for potential competitors (Amazon.com being the latest to announce an entry into the market). A billion songs is a lot of content for people to abandon in a move to a different--and incompatible--format. One might hope that Apple would open its digital rights management software to other players in the market. Amazon's scheme of making digital downloads available immediately for CDs that you buy online is intriguing. This week alone I have purchased three CDs and immediately converted them to play on my iPod or my digital music server at home, bypassing the CD player entirely. (To be honest, I don't remember the last time I actually played a CD--I prefer being able to put together diverse song lists with my Squeezebox device at home.) As someone who is mostly interested in classical music (although I am also known to download trashy dance music), I don't usually download "by the song" but rather by complete albums. (Since symphonies and operas are multi-movement works, in most cases it's cheaper to buy the whole thing than to download a few segments.) The availability of downloadable music has not diminished my purchasing of CDs--sometimes I just want the liner notes, which I can't get with online downloads.
The landscape continues to change rapidly, and undoubtedly Apple's dominance will be challenged. Others want a piece of the action. But if Steve Jobs's previous actions warrant any prediction of the future, he has more plans up his sleeve to keep things interesting--and to keep me buying Apple computers.
October 13, 2005
The new video iPod offers possibilities besides Desperate Housewives
Apple Computer yesterday announced a new iPod with video capabilities, along with a new version of its iTunes software and music videos and TV shows (notably ABC's Desperate Housewives and Lost) downloadable from the iTunes store. Presumably other shows (maybe full-length movies?) will also become available in the future.
Although Apple is undoubtedly in it to sell not only new iPods but also the media, the new machine raises other intriguing possibilities. There are already other portable media devices in the marketplace; however, Apple's entry into this market, with the seemingly unstoppable iPod franchise, carries much more clout and has the potential to make portable video a serious, practical option. (The video functionality in the new iPod comes with the device--it is "just there" without the user having to do anything, so when the user is ready for video, s/he can do it without a hassle, using the same iTunes syncing that is used for music and podcasts. This seamless integration should not be underestimated.)
One of the new capabilities is that, using QuickTime Pro, it is possible to load your own movies into iTunes and onto the iPod. This means, for instance, that a professor (or Case's ITAC) could make the mediavision courses available for download. Other media clips could be made available as well. We are barely scratching the surface of what could be done in the educational arena with multimedia. It will be very interesting to watch this new little device and see how it plays out. (No pun intended.)
GIS Conference at KSL Today and Tomorrow
Beginning at 1:00 PM today, Thursday, October 13th, and running through early afternoon on Friday, Kelvin Smith Library will host its GIS Conference 2005: Sustaining the Future and Understanding the Past. The conference is free and open to the entire Case community. See the web site for the schedule and more details.
The speakers will include distinguished scholars such as Gregory Crane, Professor of Classics at Tufts University and Editor-in-Chief of the Perseus Project, and Jeanette Zerneke, from the Department of International and Area Studies at UC Berkeley. Ms. Zerneke will speak about "Dynamic Maps and Cultural Atlases, from the Silk Road to North American Missions."
Other sessions will focus on immigration and neighborhoods, remote sensing data collection, Open Source GIS software vs. commercial software, and many others.
September 29, 2005
More on Wikipedia
The New York Times web site today published a CNET news piece about the online encyclopedia Wikipedia. An Esquire magazine author was writing an article explaining Wikipedia, so in order to show how it worked, he published a rough draft, full of errors and typos, on Wikipedia, and let the wiki masses have at it. According to the article on the Times site, the draft Esquire article was edited 224 times in the first 24 hours after it was posted and 149 times in the next 24 hours. The Esquire author's idea was to publish both the "original version" and the edited version in the magazine.
This particular episode was something of a parlor trick (by the author's own admission); however, it does demonstrate the effectiveness of the wiki/open source movement for certain kinds of things, with certain kinds of controls. Seems like a PhD thesis waiting to be written--the effectiveness of online editing in the wiki environment.
September 20, 2005
The New Yorker on DVD
NPR's Morning Edition today had a feature about an interesting publishing venture: every issue of The New Yorker back to its beginning in 1925 on eight DVDs. The entire content of each issue will be presented in context, as it appeared on the page of the printed issue, complete with cartoons and advertising. Here's the link to The New Yorker's own online ad for the DVD publication. The $63 that amazon.com is charging for the DVD collection is close to what one would pay for a current yearly subscription.
Libraries, of course, have been subscribing to electronic versions of journals for use by their customers for years, first on CD-ROM, then online. But this is one of the first electronic journal publications directed primarily at the end-consumer. (National Geographic is the other notable example, but their publication is now several years old and technology has become exponentially more sophisticated.) From a technology standpoint it can and will become problematic for those of us who are inclined to purchase these DVDs. It is the eternal question: what will I do when DVD is no longer the technology of choice? I will have eight lovely New Yorker coasters, and I'll purchase the new format. (Perhaps by that time, however, I'll be so out of it that I won't care about The New Yorker anymore.)
July 14, 2005
Internet Archive sued
The Internet Archive and its front end, the Wayback Machine, are the targets of a lawsuit, the New York Times reports today, by Healthcare Advocates. The other defendant in the lawsuit is a Philadelphia law firm that used the Wayback Machine to gather information about Healthcare Advocates in preparing a previous case.
Healthcare Advocates claims that the law firm and the Internet Archive infringed the copyright of its web pages, which are stored in the Internet Archive's database, and also violated the Digital Millennium Copyright Act. A further issue is that Healthcare Advocates' web site had used a "robots.txt" file to prevent the Internet Archive's web-crawler bot from gathering and displaying information from the company's site. The Internet Archive's bot respected the robots.txt directive (although compliance is purely voluntary, and there would have been no legal requirement to do so), but it was still possible to search for and find some of the pages that Healthcare Advocates had hoped to suppress.
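For readers unfamiliar with the convention: robots.txt is simply a plain-text file placed at the root of a web site that asks crawlers to stay out of certain paths, and compliance is entirely voluntary on the crawler's part. A minimal sketch of a file asking the Internet Archive's crawler (which identifies itself as "ia_archiver") to avoid an entire site might look like this:

```text
# Sketch of a robots.txt that excludes the Internet Archive's crawler
User-agent: ia_archiver
Disallow: /
```

The "Disallow: /" line covers every path on the site; a narrower rule (e.g., "Disallow: /private/") would exclude only part of it.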
Legal experts quoted in the article claim that it will be very difficult to prove violations of the copyright act. Brewster Kahle, the founder of the Internet Archive, had no comment.
July 07, 2005
Just as you dress for success, you should manage your gadgets for success
Today's New York Times has an article about cell phone etiquette in the work place: stories about people interrupting business meetings to take calls from their children arranging meal choices; seminar leaders stopping their sessions to take calls; patients taking calls while the doctor is in the office waiting for them. We've all heard the stories, and witnessed them, and perhaps even participated in such calls.
I have long wondered when we made the transition to the seeming requirement that we must be personally and professionally available at all times. Are we so removed from personal interaction that we need our mobile phones to make others think that we are important? Thankfully, Case has not been infected by the Blackberry (aka "Crackberry" in some circles for its addictive qualities) mania that has infected many business and government offices, which allows meeting attendees to ignore the meeting and read and send emails, not to mention a host of other antisocial behavior. There are stories about government workers flirting in noisy bars via their Blackberries.
Just as loud personal phone calls in the office disturb colleagues, one wonders about the lack of courtesy that people exhibit with regard to their colleagues and fellow travelers. Do I really want to know the intimate details of someone's love life or divorce proceedings? (No, I don't, although I've heard--unsolicited--both.) Then there is the noisy doc in the locker room at 1-2-1 Fitness Center who talks about his patients' diagnoses for all to hear. (He's usually in another aisle of lockers, so maybe it's a case of "I can't see you, so you can't hear me.")
Many libraries have banned mobile phones altogether. (I visited the New York Public Library main reading room in January, and at the entrance there is a huge standing sign of a cellphone with the international red slash "forbidden" mark through it. It's a little hard to miss their point.) The Kelvin Smith Library has articulated a policy of allowing mobile phones only in the lobby area outside the security gates of the library. The policy is widely ignored, although it does seem to have lessened the noise of ringers. More to the point, the policy empowers library users to ask others to stop using their phones if they find such use disturbing. A second reason NOT to use your mobile in the atrium and main staircase area is that the atrium acts like a gigantic megaphone: everything you say on the stairs can be heard on all of the landings throughout the building. (The same advice applies to face-to-face conversations.)
June 24, 2005
New ALA report: Nearly all libraries offer free internet access
The American Library Association has released a new report, covered in today's New York Times, finding that 98.9% of all libraries offer free internet access to their users.
The study also reported that almost 40 percent of public libraries filter public Internet access to prevent minors from gaining access to sexually related materials.
June 21, 2005
A Failed Wiki Experiment
Today's New York Times has an article about an experiment by the Los Angeles Times using Wiki technology to let readers alter a 1,000-word editorial, in much the same way as the Wikipedia allows alterations in articles.
After Slashdot.org posted a link to the "Wikitorial," the LA Times started finding pornography posted, and the paper was forced to take the site down after a day and a half, early on Sunday morning. Here is the LA Times story on the episode. The paper expresses hope that some version of the feature may be tried again in the future. Too bad that "juveniles" (in emotional maturity, if not age) had to screw it up for others.
June 01, 2005
Reviewed in the NY Times: A blog worth checking out
There are zillions of blogs out there, and most of them (including, perhaps, this one) aren't worth the time it takes to open the URL to read them. Yesterday's New York Times had a review (yes, a review of a blog!) of something very different. It is called PostSecret. PostSecret is described as "an ongoing community art project where people mail in their secrets anonymously on one side of a homemade [4 x 6 inch] postcard." The postcards are then scanned, and the images are posted to the blog.
The secrets run from the trivial ("I save all the staples I pull out of papers at work. They're in a box on my desk and weigh a pound and a half." or "I leave poetry in library books.") to the horrific (Caption on a postcard showing a photo of the burning twin towers on 9/11: "He should have been at work that day. I wish he had been." or "My father was jailed for rape and molestation of his girlfriend's daughters. He's been there several years I've always suspected he molested me as well. But I've never said anything and I'm scared to find out if my suspicions are true. I'm not sure if my farther is the imprisoned one, or if the one imprisoned is me.")
The site doesn't have any fancy features, just the postcards, which you could spend hours reading. (It is so compelling that I have to pull myself away from it.) This is a true "digital exhibit" in which the art and the anonymous artists speak for themselves.
April 25, 2005
NPR story about testing for technology competence
NPR's Morning Edition show today featured a story about a new standardized test created by the makers of the SAT to be used to test for computer literacy. Several students were interviewed; one stated that "everyone in my generation is expected to know how to use computers and make Power Point presentations, but not all of us have those skills."
The story was a bit muddled from a librarian's standpoint, because the terms computer literacy and information literacy were used interchangeably, which they are not, in my mind. Computer literacy relates to specific technical skills of using computer technology and software; information literacy refers to the ability to make judgments about the sources and quality of information resources. One can have outstanding computer literacy skills while being information illiterate. One student interviewed for the NPR story described his primary source of information as Google, and said that his professors "didn't care" about the quality of his information, as long as he cited it correctly. The NPR reporter went on to say that the faculty of this student's institution did, in fact, care a great deal about the information, but was at a loss to know how to bring these skills to the students.
KSL librarians are trying to work closely with faculty (particularly those involved with SAGES) to bring these information literacy skills to undergraduate students. Click here for a current project sponsored by KSL relating to information literacy. (Maybe win an iPod Shuffle too!)
April 22, 2005
An Interesting Article About the Problems of Digitizing
I am currently pushing on my colleagues at the library a very interesting article by Richard Preston, which appeared in the April 11, 2005, issue of The New Yorker, about the digitization of "The Hunt of the Unicorn" tapestries, woven around 1500, that hang in The Cloisters, the branch of the Metropolitan Museum of Art in New York that displays medieval art.
In the process of restoring the tapestries, the museum decided to digitize them, both front and back, so that there would be a permanent digital record preserved in case a catastrophe caused the destruction of the artifacts themselves. The museum hired very reputable consultants and photographers to make very high resolution images of the tapestries. The images were made in segments, with the idea that they could be "stitched together" into a seamless single image using Adobe Photoshop software. But when the museum staff tried to do so, they found that the images were far too large (filling more than two hundred data CDs) and complex to manage.
The museum then turned to two mathematician brothers, Gregory and David Chudnovsky. The brothers are number theorists who built their own supercomputer out of mail-order parts. Their previous claim to fame had been using that homemade supercomputer to calculate the number pi to beyond two billion decimal places. The brothers thought that assembling the images would be a piece of cake.
But when they made their first attempt, it failed, and the Chudnovskys had no idea why. Upon further investigation of the digitization process, it was discovered that the tapestries had changed shape very subtly while lying on the conservation lab floor being photographed. The Chudnovskys realized that they were working with images of a three-dimensional structure, and that they would need to recalculate every pixel of every image in order to assemble the composite and to correct for subtle differences in color that had occurred during digitization. The series of computations, which ran for three months on their supercomputer, was comparable in scale to DNA sequencing. After the computations were completed, the final assembly of the first image took twenty-four hours of supercomputer time.
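To give a sense of the underlying operations (though emphatically not the Chudnovskys' actual method), the two basic steps described above, assembling tiles into one composite and applying a per-pixel color correction, can be sketched in miniature. The tile values, grid size, and correction factors below are all invented for illustration:

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Assemble a row-major list of equally sized tiles into one image array."""
    rows, cols = grid_shape
    row_strips = [np.hstack(tiles[r * cols:(r + 1) * cols]) for r in range(rows)]
    return np.vstack(row_strips)

def correct_color(tile, gain, offset):
    """Apply a simple linear per-pixel correction, clipped to the 8-bit range."""
    return np.clip(tile.astype(float) * gain + offset, 0, 255).astype(np.uint8)

# Four invented 2x2 grayscale tiles forming a 4x4 composite
tiles = [np.full((2, 2), v, dtype=np.uint8) for v in (10, 20, 30, 40)]
corrected = [correct_color(t, gain=1.1, offset=5) for t in tiles]
image = stitch_tiles(corrected, (2, 2))
print(image.shape)
```

The real job differed from this toy in degree (hundreds of CDs of data) and in kind: the tapestries' subtle shape changes meant the tiles did not line up on a rigid grid, so every pixel's position, not just its color, had to be recomputed.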
At the end of the article, Richard Preston, the author, describes revisiting the restored and re-hung tapestries at The Cloisters. Despite the phenomenal technological feats that had been used to create the digital image, the real things were still vastly superior, "full of velvety pools and shimmering surfaces, alive with color and detail. ... In comparison, the digital images, good and accurate as they were, had seemed flat. They had not captured the translucent landscape of the Unicorn tapestries."