Reflecting Wolfram Alpha
In case you did not have the chance to watch yesterday's presentation at the Berkman Center, here is Stephen Shankland's CNET article:
• Data curation. Wolfram Alpha uses public and licensed proprietary data sources, and the company uses automated processes and human choices to prepare the data. “At some point you need a human domain expert in front of it,” Wolfram said.
• Algorithms. Alpha must pick the right computational processes to present its results. “Inside Wolfram Alpha are 5 million to 6 million lines of Mathematica code that implement all those methods and models,” he said.
• Linguistic analysis to understand what a person typed. “I thought one of many things that could have gone wrong was that short, lazy things would (have) huge amounts of ambiguity,” for example figuring out whether “50 cent” had to do with musical artists or money. “That turned out to be not nearly as much of a problem as we expected.”
• Presentation. “There are tens of thousands of possible graphs. What do you want to show people?” Wolfram asked.
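To make the ambiguity point concrete, here is a toy sketch (in Python, not Wolfram Alpha's actual Mathematica internals) of the kind of disambiguation the linguistic-analysis step has to do: a short query like “50 cent” maps to several candidate interpretations, and a resolver picks one. The interpretation table and weights below are invented for illustration.

```python
# Hypothetical knowledge base: query -> list of (interpretation, prior weight).
# The entries and weights are made up; a real system would derive them
# from curated data and usage statistics.
INTERPRETATIONS = {
    "50 cent": [
        ("musical artist", 0.7),    # assumed to be the more common reading
        ("monetary amount", 0.3),
    ],
    "pi": [
        ("mathematical constant", 0.9),
        ("film title", 0.1),
    ],
}

def interpret(query: str) -> str:
    """Return the highest-weighted interpretation for a short query."""
    candidates = INTERPRETATIONS.get(query.lower(), [("unknown", 0.0)])
    best, _weight = max(candidates, key=lambda c: c[1])
    return best

print(interpret("50 Cent"))  # -> musical artist
print(interpret("pi"))       # -> mathematical constant
```

Wolfram's remark suggests that in practice such short queries were less ambiguous than feared, i.e. the top candidate usually dominates.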
Issues that are not yet clear are: (a) does it work in the real world, (b) does it empower us or “the experts,” (c) is it really something new, or does Google already do it better with Trendalyzer (or Ralf/Martin with Eyeplorer), and (d) how important is the approach for data management? What do you think?