IDG News Service - Users of NoSQL databases and data processing frameworks such as CouchDB and Hadoop are deploying these new technologies for their speed, scalability and flexibility, judging from a number of sessions at the NoSQL Now conference being held this week in San Jose, Calif.
EMC is using a mixture of traditional databases and newfangled NoSQL data stores to analyze public perception of the company and its products, explained Subramanian Kartik, distinguished EMC engineer, during one talk.
MORE FROM CONFERENCE: 'NewSQL' could combine the best of SQL and NoSQL
The process, called sentiment analysis, involves scanning hundreds of technology blogs, finding mentions of EMC and its products, and assessing whether each reference is positive or negative based on the words in the text.
To execute the analysis, EMC gathers the full text of all the blog and Web pages mentioning EMC and feeds them into a MapReduce implementation running on its Greenplum data analysis platform. Hadoop then weeds out the Web markup code and non-essential words, which slims the data set considerably. Finally, the word lists are passed into SQL-based databases, where a more thorough quantitative analysis is done.
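The Hadoop cleanup stage described above can be pictured as a map function over raw pages. The sketch below is illustrative only: the stopword list, regexes, and function name are assumptions, not EMC's actual code.

```python
import re

# Assumed, minimal stand-ins for the "non-essential words" and markup
# stripping the article describes.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it"}
TAG_RE = re.compile(r"<[^>]+>")       # strip HTML/Web markup
TOKEN_RE = re.compile(r"[a-z0-9']+")  # crude lowercase tokenizer

def clean_page(raw_html):
    """Map step: raw page text -> list of significant lowercase tokens."""
    text = TAG_RE.sub(" ", raw_html).lower()
    return [t for t in TOKEN_RE.findall(text) if t not in STOPWORDS]

print(clean_page("<p>EMC and the Greenplum platform</p>"))
# -> ['emc', 'greenplum', 'platform']
```

In a real Hadoop job this function would run as the mapper over each fetched page, with the resulting word lists loaded into the SQL side for analysis.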
The NoSQL technologies are useful in summarizing a huge data set, while SQL can then be used for a more detailed analysis, Kartik said, adding that this hybrid approach can be applied to many other areas of analysis as well.
"There is all sorts of information out there, and at some point you will have to go through tokenizing, parsing and natural language processing. The way to get to any meaningful quantitative measures of this data is to put it in an environment you know can manipulate it well, in a SQL environment," Kartik said.
For digital media company AOL, NoSQL products provide speed and volume that would not be possible using traditional relational databases.
The company uses Hadoop and the CouchDB NoSQL database to run its ad targeting operations, said Matt Ingenthron, manager of community relations for Couchbase, during another talk.
AOL has developed a system that picks out a set of targeted ads each time a user opens an AOL page. Which ads are chosen can be based on the data AOL holds on the user, along with algorithmic guesses about which ads would most interest that user. The whole process must complete within about 40 milliseconds.
The source data is voluminous. Logs are kept of every user action on every server, and they must be parsed and reassembled to build a profile of each user. Ad brokers also define a complex set of rules governing how much they will pay for an ad impression and which ads should be shown to which users.
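The per-request flow described above — look up the user's profile by key, match broker rules, pick the best-paying eligible ad, all inside the latency budget — might be sketched as follows. The profile store, rule format, and all names here are invented for illustration; a production system would query a store such as Couchbase rather than an in-memory dict.

```python
import time

# Hypothetical profile store and broker rules (assumptions for illustration).
PROFILES = {"user:42": {"interests": {"cars", "travel"}}}
AD_RULES = [
    {"ad": "sedan-sale", "targets": {"cars"},   "bid": 0.08},
    {"ad": "beach-trip", "targets": {"travel"}, "bid": 0.12},
    {"ad": "pet-food",   "targets": {"pets"},   "bid": 0.20},
]
BUDGET_SECONDS = 0.040  # the roughly 40 ms window mentioned above

def choose_ad(user_key):
    """Pick the highest-bidding ad that targets the user's interests."""
    deadline = time.monotonic() + BUDGET_SECONDS
    profile = PROFILES.get(user_key, {"interests": set()})
    best = None
    for rule in AD_RULES:
        if time.monotonic() > deadline:
            break  # out of time: serve the best match found so far
        if rule["targets"] & profile["interests"]:
            if best is None or rule["bid"] > best["bid"]:
                best = rule
    return best["ad"] if best else None

print(choose_ad("user:42"))  # -> 'beach-trip'
```

The hard deadline, rather than the matching logic, is what pushes systems like this toward low-latency key-value stores.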
This activity generates 4 to 5 terabytes of data a day, and AOL has amassed 600 petabytes of operational data. The system maintains more than 650 billion keys, including one for every user as well as keys for other aspects of the data, and must react to 600,000 events every second.
Much of this source data comes from feeds of Web server logs and outside sources. Flume, a component of the Hadoop ecosystem, is used to ingest the data, and the Hadoop cluster executes a series of MapReduce jobs to parse the raw data into summaries.
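The summarization jobs described above follow the classic MapReduce shape: emit a key per log event, then reduce by summing. This is a generic sketch of that pattern, not AOL's jobs; the log format and function names are assumptions.

```python
from collections import Counter

def map_phase(log_lines):
    """Emit (user_id, 1) for every event line of the form 'user,event'."""
    for line in log_lines:
        user_id, _event = line.split(",", 1)
        yield user_id, 1

def reduce_phase(pairs):
    """Sum the emitted counts per user key."""
    counts = Counter()
    for user_id, n in pairs:
        counts[user_id] += n
    return dict(counts)

logs = ["u1,click", "u2,view", "u1,view"]
print(reduce_phase(map_phase(logs)))  # -> {'u1': 2, 'u2': 1}
```

On a real cluster the map and reduce functions run distributed across the raw log files, producing the compact per-user summaries the profiles are built from.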