The Attunity Blog
Recently, Attunity was pleased to announce that Dentegra, a growing US dental benefits provider, had selected our universal data integration solution, Attunity Replicate, as its strategic enterprise data ingest and replication platform. The Attunity software will provide Dentegra with real-time data availability and integration across its heterogeneous operational databases and analytic platforms, and is set to displace the dental insurer's incumbent data replication technology. This initiative is expected to help Dentegra accelerate business solutions while reducing IT costs and labor.
Attunity is proud to be a member of the Hortonworks, Inc. Partnerworks program, a global community that partners with Hortonworks to jointly innovate and implement integrated customer solutions for the on-premises data center and the cloud. With certifications for HDP, HDP YARN Ready, and HDP SEC Ready, Attunity is a founding member of Hortonworks' Modern Data Solutions (MDS) partner initiative.
NGP VAN is the leading technology provider to progressive political campaigns and non-profit organizations, offering clients an integrated platform of the best fundraising, compliance, field, organizing, digital, and social networking products. Nearly every major Democratic campaign in America is powered by NGP VAN, including the Obama campaign’s voter contact, volunteer, fundraising and compliance operations in all 50 states.
The story of how data scientists became sexy is mostly the story of the coupling of the mature discipline of statistics with a much younger one, computer science. The term "Data Science" has emerged only recently to specifically designate a new profession that is expected to make sense of the vast stores of big data. But making sense of data has a long history and has been discussed by scientists, statisticians, librarians, computer scientists and others for years. The following timeline traces the evolution of the term "Data Science," its use, attempts to define it, and related terms.
As we headed into 2016, InformationWeek had ten predictions for Big Data. Their list included the rise of the Chief Data Officer, the coming of the data-as-a-service business model, and the ability to get real-time insights from data. Looking back at 2016, we saw many of these predictions come true in what some of Attunity's customers accomplished with their Big Data.
The story of how data became big starts many years before the current buzz around big data. The first attempts to quantify the growth rate in the volume of data, or what has popularly been known as the "information explosion" (a term first used in 1941, according to the Oxford English Dictionary), date back some seventy years. The following are the major milestones in the history of sizing data volumes, plus other "firsts" in the evolution of the idea of "big data" and observations pertaining to the data or information explosion.
North Bridge, a growth equity and venture capital firm, in partnership with research analyst firm Wikibon, announced the results of its sixth annual Future of Cloud Computing Survey, which analyzes trends in cloud computing adoption, use, and challenges on a yearly basis.
Itamar Ankorion, CMO of Attunity Inc., spoke with Jeff Frick (@JeffFrick), host of theCUBE from the SiliconANGLE Media team, during AWS re:Invent about customers' struggles to manage their sprawling data. He said that while it was easy enough for Attunity to help them migrate some data to Amazon Redshift, customers still needed to set up actual data centers.
Today, Attunity is thrilled to announce the availability of Attunity Compose for Amazon Redshift, a new and innovative solution for automating and accelerating data warehousing and ETL in the AWS cloud!
"The mainframe is going away" is as true now as it was 10, 20, and 30 years ago. Mainframes are still crucial for handling critical business transactions; however, they were built for an era in which batch data movement was the norm, and they can be difficult to integrate into today's data-driven, real-time, analytics-focused business processes and the environments that support them.