Apache Sqoop Cookbook

Book description

Integrating data from multiple sources is essential in the age of big data, but it can be a challenging and time-consuming task. This handy cookbook provides dozens of ready-to-use recipes for using Apache Sqoop, the command-line interface application that optimizes data transfers between relational databases and Hadoop.

Sqoop is both powerful and bewildering, but with this cookbook’s problem-solution-discussion format, you’ll quickly learn how to deploy and then apply Sqoop in your environment. The authors provide MySQL, Oracle, and PostgreSQL database examples on GitHub that you can easily adapt for SQL Server, Netezza, Teradata, or other relational systems.

  • Transfer data from a single database table into your Hadoop ecosystem (see the sample commands after this list)
  • Keep table data and Hadoop in sync by importing data incrementally
  • Import data from more than one database table
  • Customize transferred data by calling various database functions
  • Export generated, processed, or backed-up data from Hadoop to your database
  • Run Sqoop within Oozie, Hadoop’s specialized workflow scheduler
  • Load data into Hadoop’s data warehouse (Hive) or database (HBase)
  • Handle installation, connection, and syntax issues common to specific database vendors
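
For a flavor of the recipes, here is a minimal sketch of a single-table import and a matching export. The JDBC URL, credentials, table name, and HDFS paths below are placeholders to adapt to your own environment, and passing --password on the command line is used only for brevity (the book's "Protecting Your Password" recipe covers safer alternatives):

    # Import the "cities" table from a MySQL database into HDFS
    sqoop import \
      --connect jdbc:mysql://mysql.example.com/sqoop \
      --username sqoop \
      --password sqoop \
      --table cities \
      --target-dir /etl/input/cities

    # Export processed data from HDFS back into the "cities" table
    sqoop export \
      --connect jdbc:mysql://mysql.example.com/sqoop \
      --username sqoop \
      --password sqoop \
      --table cities \
      --export-dir /etl/input/cities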

Table of contents

  1. Apache Sqoop Cookbook
  2. Foreword
  3. Preface
    1. Sqoop 2
    2. Conventions Used in This Book
    3. Using Code Examples
    4. Safari® Books Online
    5. How to Contact Us
    6. Acknowledgments
      1. Jarcec Thanks
      2. Kathleen Thanks
  4. 1. Getting Started
    1. Downloading and Installing Sqoop
      1. Problem
      2. Solution
      3. Discussion
    2. Installing JDBC Drivers
      1. Problem
      2. Solution
      3. Discussion
    3. Installing Specialized Connectors
      1. Problem
      2. Solution
      3. Discussion
    4. Starting Sqoop
      1. Problem
      2. Solution
      3. Discussion
    5. Getting Help with Sqoop
      1. Problem
      2. Solution
      3. Discussion
  5. 2. Importing Data
    1. Transferring an Entire Table
      1. Problem
      2. Solution
      3. Discussion
    2. Specifying a Target Directory
      1. Problem
      2. Solution
      3. Discussion
    3. Importing Only a Subset of Data
      1. Problem
      2. Solution
      3. Discussion
    4. Protecting Your Password
      1. Problem
      2. Solution
      3. Discussion
    5. Using a File Format Other Than CSV
      1. Problem
      2. Solution
      3. Discussion
    6. Compressing Imported Data
      1. Problem
      2. Solution
      3. Discussion
    7. Speeding Up Transfers
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    8. Overriding Type Mapping
      1. Problem
      2. Solution
      3. Discussion
    9. Controlling Parallelism
      1. Problem
      2. Solution
      3. Discussion
    10. Encoding NULL Values
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    11. Importing All Your Tables
      1. Problem
      2. Solution
      3. Discussion
  6. 3. Incremental Import
    1. Importing Only New Data
      1. Problem
      2. Solution
      3. Discussion
    2. Incrementally Importing Mutable Data
      1. Problem
      2. Solution
      3. Discussion
    3. Preserving the Last Imported Value
      1. Problem
      2. Solution
      3. Discussion
    4. Storing Passwords in the Metastore
      1. Problem
      2. Solution
      3. Discussion
    5. Overriding the Arguments to a Saved Job
      1. Problem
      2. Solution
      3. Discussion
    6. Sharing the Metastore Between Sqoop Clients
      1. Problem
      2. Solution
      3. Discussion
  7. 4. Free-Form Query Import
    1. Importing Data from Two Tables
      1. Problem
      2. Solution
      3. Discussion
    2. Using Custom Boundary Queries
      1. Problem
      2. Solution
      3. Discussion
    3. Renaming Sqoop Job Instances
      1. Problem
      2. Solution
      3. Discussion
    4. Importing Queries with Duplicated Columns
      1. Problem
      2. Solution
      3. Discussion
  8. 5. Export
    1. Transferring Data from Hadoop
      1. Problem
      2. Solution
      3. Discussion
    2. Inserting Data in Batches
      1. Problem
      2. Solution
      3. Discussion
    3. Exporting with All-or-Nothing Semantics
      1. Problem
      2. Solution
      3. Discussion
    4. Updating an Existing Data Set
      1. Problem
      2. Solution
      3. Discussion
    5. Updating or Inserting at the Same Time
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    6. Using Stored Procedures
      1. Problem
      2. Solution
      3. Discussion
    7. Exporting into a Subset of Columns
      1. Problem
      2. Solution
      3. Discussion
    8. Encoding the NULL Value Differently
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    9. Exporting Corrupted Data
      1. Problem
      2. Solution
      3. Discussion
  9. 6. Hadoop Ecosystem Integration
    1. Scheduling Sqoop Jobs with Oozie
      1. Problem
      2. Solution
      3. Discussion
    2. Specifying Commands in Oozie
      1. Problem
      2. Solution
      3. Discussion
    3. Using Property Parameters in Oozie
      1. Problem
      2. Solution
      3. Discussion
    4. Installing JDBC Drivers in Oozie
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    5. Importing Data Directly into Hive
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    6. Using Partitioned Hive Tables
      1. Problem
      2. Solution
      3. Discussion
    7. Replacing Special Delimiters During Hive Import
      1. Problem
      2. Solution
      3. Discussion
    8. Using the Correct NULL String in Hive
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    9. Importing Data into HBase
      1. Problem
      2. Solution
      3. Discussion
    10. Importing All Rows into HBase
      1. Problem
      2. Solution
      3. Discussion
    11. Improving Performance When Importing into HBase
      1. Problem
      2. Solution
      3. Discussion
  10. 7. Specialized Connectors
    1. Overriding Imported boolean Values in PostgreSQL Direct Import
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    2. Importing a Table Stored in Custom Schema in PostgreSQL
      1. Problem
      2. Solution
      3. Discussion
    3. Exporting into PostgreSQL Using pg_bulkload
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    4. Connecting to MySQL
      1. Problem
      2. Solution
      3. Discussion
    5. Using Direct MySQL Import into Hive
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    6. Using the upsert Feature When Exporting into MySQL
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    7. Importing from Oracle
      1. Problem
      2. Solution
      3. Discussion
    8. Using Synonyms in Oracle
      1. Problem
      2. Solution
      3. Discussion
    9. Faster Transfers with Oracle
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    10. Importing into Avro with OraOop
      1. Problem
      2. Solution
      3. Discussion
    11. Choosing the Proper Connector for Oracle
      1. Problem
      2. Solution
      3. Discussion
    12. Exporting into Teradata
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    13. Using the Cloudera Teradata Connector
      1. Problem
      2. Solution
      3. Discussion
      4. See Also
    14. Using Long Column Names in Teradata
      1. Problem
      2. Solution
      3. Discussion
  11. About the Authors
  12. Colophon
  13. Copyright

Product information

  • Title: Apache Sqoop Cookbook
  • Author(s): Kathleen Ting, Jarek Jarcec Cecho
  • Release date: July 2013
  • Publisher(s): O'Reilly Media, Inc.
  • ISBN: 9781449364588