Amazon Web Services rolls out Kinesis

Kinesis can handle up to terabytes of data an hour

Amazon Web Services this week rolled out a new cloud-based data analytics tool named Kinesis, which can analyze massive amounts of data in real time and is paid for by the hour.

Kinesis is a service that sits in the cloud and receives data from any number of sources: databases within Amazon's cloud, such as the data warehouse tool Redshift, the NoSQL database DynamoDB, or the relational database RDS. It then performs analytics on that data and returns the results. AWS built the service on its own combination of hardware and software. The system is also scalable, able to handle up to terabytes of data an hour from potentially thousands of sources. (A brief, illustrative sketch of writing records to a Kinesis stream appears at the end of this article.)

[MORE AWS: Amazon ratchets up its enterprise focus]

In announcing the tool at re:Invent, the company's customer conference, AWS showed how Kinesis can be used to analyze thousands of Twitter updates in real time, allowing queries to be performed on the data as it arrives. For example, Kinesis was able to pinpoint the most popular word tweeted within an hour-long window of tweets fed into the system. (A second sketch at the end of this article illustrates that kind of word count.) The data Kinesis generates can then be offloaded into one of Amazon's storage platforms, such as Simple Storage Service (S3). The service could also be used to analyze real-time financial transactions, inbound marketing data or metering data, for example.

The new service complements the data analysis tools AWS already offers. Redshift, for example, can run analyses on data stored within it, but it is meant for longer-term data kept in Amazon's cloud; Kinesis is meant for rapid, real-time analysis of streaming data.

Kinesis also fits in well with a growing number of Amazon partner companies that offer tools to help make sense of the data AWS analyzes. Jaspersoft, for example, can take the results of queries run in Redshift, create visualizations from them and set up alerts. That sort of platform is a natural fit for giving customers actionable insight from the analysis AWS performs.

The move also represents AWS's continued push to give customers more options for analyzing their data. AWS already offers a pay-by-the-hour Hadoop cluster named Elastic MapReduce (EMR), and S3 has scaled to store literally trillions of objects in AWS's cloud. New tools for running analytics jobs on all that data were an area where experts expected AWS to make announcements at re:Invent.

The service was released in limited preview starting today.

Senior Writer Brandon Butler covers cloud computing for Network World and NetworkWorld.com. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW. Read his Cloud Chronicles at http://www.networkworld.com/community/blog/26163.
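The first sketch below shows, in rough outline, what feeding data into a Kinesis stream looks like. It uses the AWS SDK for Python (boto3), which postdates this article; the stream name "tweets", the region, and the record fields are illustrative assumptions, not anything AWS has published.

```python
# Illustrative only: a tiny producer that writes JSON records to a Kinesis
# stream using boto3. The stream name "tweets", the region, and the record
# fields are hypothetical; the stream must already exist and AWS credentials
# must be configured in the environment.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")


def put_tweet(stream_name: str, tweet: dict) -> None:
    """Write one record; the partition key decides which shard receives it."""
    kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(tweet).encode("utf-8"),
        PartitionKey=tweet.get("user", "anonymous"),
    )


if __name__ == "__main__":
    put_tweet("tweets", {"user": "alice", "text": "watching the re:Invent keynote"})
```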
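The second sketch, equally hypothetical, loosely mirrors the Twitter demo described above: it reads records back from one shard of the same assumed "tweets" stream and tallies the most common words. Reading a single shard and polling a fixed number of times are simplifications for demonstration.

```python
# Illustrative only: read records from the first shard of a Kinesis stream
# and count word frequencies, loosely mirroring the re:Invent Twitter demo.
# Single-shard, fixed-poll-count logic is a simplification, not a pattern
# recommended by AWS.
import json
from collections import Counter

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")


def most_common_words(stream_name: str, top_n: int = 10):
    # Look up the first shard of the stream (assumes at least one shard exists).
    shard_id = kinesis.describe_stream(StreamName=stream_name)[
        "StreamDescription"]["Shards"][0]["ShardId"]

    # Start reading from the oldest record still retained in the shard.
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]

    counts = Counter()
    for _ in range(10):  # a handful of polls is plenty for a demo
        batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for record in batch["Records"]:
            tweet = json.loads(record["Data"])  # record["Data"] is raw bytes
            counts.update(tweet.get("text", "").lower().split())
        iterator = batch["NextShardIterator"]
    return counts.most_common(top_n)


if __name__ == "__main__":
    print(most_common_words("tweets"))
```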