There are many tools, both IBM and open source, for moving data into a Hadoop database or HDFS file system for data exploration and analysis by data scientists and other business users. Many of these tools require a level of expertise or database access that data scientists do not typically possess, making it difficult for them to exploit the benefits of a Big Data Landing Zone. Data Click provides a solution. With Data Click, offload activities can be defined and configured to easily move data from any ODBC data source into the Big Data Landing Zone. Data scientists can then request these offload activities using a web browser.
In this course, you will learn how a Data Click administrator creates and configures offload activities. You will then learn how Data Click users can run these offload requests to move data into Hadoop. You will also learn how Data Click works in conjunction with other components that come with it to provide a complete end-to-end solution: Metadata Asset Manager is used to import the metadata for offload sources and targets into the Metadata Repository; the Information Governance Catalog is used to browse, query, and understand the data available for offloading; and the Information Server Operations Console is used to monitor offload jobs after they are initiated.
If you are enrolling in a Self-Paced Virtual Classroom or Web-Based Training course, please review the Self-Paced Virtual Classes and Web-Based Training Classes section on our Terms and Conditions page before you enroll, and check the system requirements to ensure that your system meets the minimum requirements for this course.