Oracle connects the data swamps

March 29, 2018

Oracle has announced the launch of the Oracle Data Integrator Cloud Service. The service is designed to extract, load and transform (E-LT) data into big data repositories. It supports Oracle Database Cloud Service, Exadata Cloud Service, and Big Data Cloud Service, and there is also support for non-Oracle services. The key architectural point is that data does not need to be moved to another location in order to be transformed. For a cloud data integration service this matters: it removes the intermediate transfers that other services require and increases speed accordingly.
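To make the E-LT idea concrete, here is a minimal sketch in Python, using the built-in sqlite3 module as a stand-in for the target database: the transformation runs as one set-based SQL statement inside the database that already holds the data, instead of pulling rows out to a separate engine. The table names and the conversion logic are invented for illustration; this is not ODI's generated code.

# Minimal sketch of the E-LT idea: the transformation runs as SQL inside the
# target database, so rows are never staged on a separate ETL server.
# sqlite3 stands in for the target; a real ODI deployment would push similar
# SQL down to Oracle Database Cloud, Exadata, or a big data engine.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source table already loaded into (or reachable from) the target platform.
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, currency TEXT)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                [(1, 10.0, "USD"), (2, 8.5, "EUR"), (3, 7.0, "USD")])

# Target table for the transformed result.
cur.execute("CREATE TABLE tgt_orders_usd (id INTEGER, amount_usd REAL)")

# The "transform" step is a single set-based statement executed where the
# data lives -- no intermediate copy travels through a middle tier.
cur.execute("""
    INSERT INTO tgt_orders_usd (id, amount_usd)
    SELECT id,
           CASE currency WHEN 'EUR' THEN amount * 1.1 ELSE amount END
    FROM src_orders
""")
conn.commit()
print(cur.execute("SELECT * FROM tgt_orders_usd").fetchall())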

Flexible service

Jeffrey T. Pollock, Vice President of Product Management, Oracle (Image source: LinkedIn/Jeff Pollock)

The solution works both on-premises and in the cloud. It supports several different databases and can ingest data from multiple sources, such as Hadoop, applications, JDBC, files, XML, JSON and web services. Jeff Pollock, Oracle's vice president of product management, commented: “To be effective and agile, enterprises need seamless communication and flow of data between sources and targets – data originating from IoT, Web, and business applications or data that is stored in the cloud or on premises. Oracle Data Integrator Cloud provides businesses with a high-performance, simple, and integrated cloud service to execute data transformations where the data lies, with no hand coding required, and without having to copy data unnecessarily.”
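As a rough illustration of what ingesting from unlike sources involves, the Python sketch below normalises a JSON feed and a CSV feed into one row shape ahead of a single load step. The feeds and field names are hypothetical, and ODI does this through its own connectors rather than hand-written code like this.

# Illustrative only: records from unlike sources (a JSON export and a CSV
# file here; JDBC tables, XML, or web services in a real ODI flow) are
# normalised to one row shape before loading. Field names are hypothetical.
import csv, io, json

json_feed = '[{"customer_id": 1, "name": "Acme"}]'
csv_feed = "customer_id,name\n2,Globex\n"

rows = []
rows += [(int(r["customer_id"]), r["name"]) for r in json.loads(json_feed)]
rows += [(int(r["customer_id"]), r["name"]) for r in csv.DictReader(io.StringIO(csv_feed))]

print(rows)  # [(1, 'Acme'), (2, 'Globex')] -- ready for a single load step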

Designed for speed

To improve the speed of E-LT, Oracle has designed a new architecture and takes advantage of parallel processing where appropriate. Businesses selecting the service can take advantage of integrations with Oracle’s existing PaaS solutions, including Oracle Database Cloud, Oracle Database Exadata Cloud, and Oracle Big Data Cloud. There are also pre-built integrations for big data technologies such as Hive, HDFS, HBase, and Sqoop. Oracle also provides several code templates, known as Knowledge Modules, which cut the time taken to get data movements running. They cover reverse-engineering metadata, journalizing (CDC), loading from source to staging, checking constraints, integration (transform) and data services.
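The sketch below shows the parallel-processing idea in its simplest form: independent partitions of a source are loaded concurrently from a thread pool. The partition list, the load_partition function and the pool size are assumptions made for illustration, not Oracle's implementation.

# Hedged sketch of parallel loading: independent source partitions are
# processed concurrently. Everything here is illustrative, not ODI internals.
from concurrent.futures import ThreadPoolExecutor

PARTITIONS = ["2018-01", "2018-02", "2018-03", "2018-04"]

def load_partition(partition: str) -> str:
    # In a real flow this would run a Knowledge-Module-generated statement
    # such as "INSERT INTO tgt SELECT ... FROM src WHERE period = :partition"
    # on the target platform; here we only report what would be executed.
    return f"loaded partition {partition}"

with ThreadPoolExecutor(max_workers=4) as pool:
    for result in pool.map(load_partition, PARTITIONS):
        print(result)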

Because the Knowledge Modules are reusable, companies can add new sets of data to ingest easily and rapidly consolidate their data from different sources. Because the logical design is decoupled from the underlying technology, ODI mappings can be moved between platforms without the need for extensive re-coding. The new Oracle service transfers data rapidly between data lakes and is also quick to configure and set up. The graphical user interface is intuitive and allows developers to map data fields quickly and easily.
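A hedged sketch of what this decoupling looks like in principle: one logical mapping definition, rendered into different SQL for two target technologies. The mapping format and the per-platform templates are invented for illustration; in ODI this role is played by the Knowledge Modules.

# One logical mapping, two generated statements. The mapping dict and the
# per-technology templates are invented to illustrate design/runtime decoupling.
MAPPING = {
    "target": "dw_sales",
    "source": "staging_sales",
    "columns": {"sale_id": "id", "net_amount": "amount - discount"},
}

TEMPLATES = {
    "oracle": "INSERT INTO {target} ({cols}) SELECT {exprs} FROM {source}",
    "hive":   "INSERT INTO TABLE {target} SELECT {exprs} FROM {source}",
}

def render(mapping: dict, technology: str) -> str:
    # Build column and expression lists from the logical mapping, then fill
    # in the technology-specific statement template.
    cols = ", ".join(mapping["columns"])
    exprs = ", ".join(mapping["columns"].values())
    return TEMPLATES[technology].format(target=mapping["target"],
                                        source=mapping["source"],
                                        cols=cols, exprs=exprs)

for tech in TEMPLATES:
    print(render(MAPPING, tech))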

How much does it cost?

Oracle offers two pricing models: non-metered and metered. Non-metered pricing is $2,000 per month per Host OCPU; it is the OCPUs running the Oracle Data Integrator Cloud Service instance that are counted. Metered pricing comes in two forms: either $4,000 per month per Host OCPU or $6.72 per hour per Host OCPU. The second option is interesting. It should allow companies to build the templates and expertise to carry out the initial E-LT of data; if they need to repeat the transfer at a later date, they can use the hourly metered option to cut costs.
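A quick back-of-the-envelope comparison of the published rates makes the trade-off clearer. The figures below use only the prices quoted above and ignore contract terms and minimum commitments.

# Break-even comparison of the published rates per Host OCPU.
NON_METERED_MONTH = 2000.00   # $/month/OCPU
METERED_MONTH = 4000.00       # $/month/OCPU
METERED_HOUR = 6.72           # $/hour/OCPU

print(f"Hourly beats metered-monthly below {METERED_MONTH / METERED_HOUR:.0f} hours/month")
print(f"Hourly beats non-metered below {NON_METERED_MONTH / METERED_HOUR:.0f} hours/month")
for hours in (50, 300, 600):
    print(hours, "hours ->", f"${hours * METERED_HOUR:,.2f}")

On these rates, the hourly option undercuts the metered monthly rate below roughly 595 hours per Host OCPU per month, and undercuts the non-metered rate below roughly 298 hours.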

Conclusion

Oracle has produced an E-LT service suited to migrating the large data lakes that analytics requires, without some of the usual issues. That it does not need to take an interim copy of the data is key. The parallel processing capability is not explained in detail, and it would be interesting to see benchmarks for this service against competitors. As companies look to consolidate their data for improved analysis, this tool should attract the interest of several of them. It could also become a key component as Oracle looks to persuade companies to move to its cloud ERP. Moving data quickly between on-premises and cloud databases is a challenge companies need to overcome, and for those considering a move to an ERP solution it is a major hurdle.

