QlikView is a business intelligence tool for discovering and visualizing data, and it supports Hadoop environments as a data source. This article describes how QlikView was tested to integrate with and visualize data in Hortonworks Data Platform (HDP) on IBM POWER8.
Qlik provides a business intelligence (BI) solution called QlikView. QlikView provides many features beyond typical BI reports and dashboards; example capabilities include guided analytics, security, customization, and scalability. QlikView supports accessing data in Apache Hadoop environments. Validation testing was performed to verify QlikView's ability to integrate with and visualize data, specifically with Hortonworks Data Platform (HDP) on IBM® POWER8® processor-based servers. This article provides an overview of the validation tests that were completed.

Objectives

The key objectives for the validation testing of QlikView were to:
- Configure QlikView to connect to HDP 2.6 running on an IBM POWER8 processor-based server.
- Extract and visualize sample data from the Hadoop Distributed File System (HDFS) of HDP running on a POWER8 processor-based server.

Test environment

This section lists the high-level components of QlikView and HDP used in the test environment.

QlikView:

- QlikView Personal Edition 12 for Microsoft Windows
- Hortonworks ODBC Driver for Apache Hive v2.1.5
- A notebook running Windows 7

Hortonworks Data Platform:

- HDP version 2.6
- Red Hat Enterprise Linux (RHEL) version 7.2
- Minimum resources: eight virtual processors, 24 GB memory, 50 GB disk space
- IBM PowerKVM™
- IBM POWER8 processor-based server

Deployment architecture

The deployment architecture is simple. QlikView and the Hortonworks ODBC driver were installed and run on a Windows 7 system. HDP was installed and run on a POWER8 processor-based server.
QlikView and the ODBC driver were configured to connect to HDP, and data in HDP was accessed and visualized by QlikView. Tests were run in both a single-node HDP environment and a multi-node HDP cluster.

Installation and configuration

This section covers the installation and configuration of an HDP cluster and the QlikView software.

Installing and configuring the HDP cluster

The high-level steps to install and configure the HDP cluster are as follows. Follow the installation guide for HDP on Power Systems (see ) to install and configure the HDP cluster.
Log in to the Ambari server and ensure that all the services are running. The HDP cluster, Hadoop, and related services can be monitored and managed through Ambari.

Setting up test data and Hive tables

Download the MovieLens and driver test data, copy the data to HDFS, and create Hive tables. Download the MovieLens data set (see the citation in ). Follow the instructions to copy the MovieLens data to HDFS and set up Hive external tables, performing these steps as the hive user.
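Beyond the Ambari UI, service health can also be read from Ambari's REST API (`GET /api/v1/clusters/<cluster>/services?fields=ServiceInfo/state`). The sketch below parses a response of that shape to flag any service that is not in the STARTED state; the embedded payload is an illustrative sample, not output captured from this test environment.

```python
import json

# Sample payload shaped like Ambari's
# GET /api/v1/clusters/<cluster>/services?fields=ServiceInfo/state
# response (abbreviated; illustrative values only).
sample = json.dumps({
    "items": [
        {"ServiceInfo": {"service_name": "HDFS", "state": "STARTED"}},
        {"ServiceInfo": {"service_name": "HIVE", "state": "STARTED"}},
        {"ServiceInfo": {"service_name": "ZOOKEEPER", "state": "INSTALLED"}},
    ]
})

def services_not_started(payload: str):
    """Return the names of services whose state is not STARTED."""
    items = json.loads(payload)["items"]
    return [s["ServiceInfo"]["service_name"]
            for s in items
            if s["ServiceInfo"]["state"] != "STARTED"]

print(services_not_started(sample))  # ['ZOOKEEPER']
```

In a live cluster, the same payload would come from an authenticated HTTP request to the Ambari server on port 8080.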
Download the Driver Behavior data file (see ). Copy the driver data to HDFS:

```shell
# su - hive
# hadoop fs -mkdir -p /user/hive/dataset/drivers
# hadoop fs -copyFromLocal /home/np/u0014213/Data/truckeventtextpartition.csv /user/hive/dataset/drivers
# hadoop fs -copyFromLocal /home/np/u0014213/Data/drivers.csv /user/hive/dataset/drivers
# hadoop fs -ls /user/hive/dataset/drivers
Found 2 items
-rw-r--r--   3 hive hdfs    2043 2017-05-21 06:30 /user/hive/dataset/drivers/drivers.csv
-rw-r--r--   3 hive hdfs 22-05-21 06:30 /user/hive/dataset/drivers/truckeventtextpartition.csv
```

Create Hive tables for the driver data. Connect to the HiveServer2 instance of HDP 2.6 running on the IBM POWER8 processor-based server as shown in Figure 3.
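The article does not reproduce the table definitions, but a Hive external table over the driver data copied above could look like the following sketch. The column names and types are illustrative assumptions based on the CSV contents, not taken from the article; adjust them to match the actual file header.

```sql
-- Sketch only: column names/types are assumptions, not from the article.
-- In practice each table should point at a directory containing only its
-- own file; the path below is the upload location used earlier.
CREATE EXTERNAL TABLE IF NOT EXISTS drivers (
  driverId   INT,
  name       STRING,
  ssn        STRING,
  location   STRING,
  certified  STRING,
  wage_plan  STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/hive/dataset/drivers'
TBLPROPERTIES ("skip.header.line.count" = "1");
```

Because the table is external, dropping it later removes only the metadata and leaves the CSV files in HDFS untouched.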
Select the ODBC data source added through the ODBC Administrator in the previous step. Provide the Hive user name and password (use the Hive DB password, not the Hive UNIX user password). The connection to HiveServer2 must succeed before you continue. Note: If you have already created a Hive database and tables with a user name and password, use those same credentials here. In this test, we used hive as the user name and Ibmpdp as the password.

Connecting to HDP
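QlikView reaches Hive through the same ODBC driver that any client can use programmatically. As a minimal sketch, the function below only assembles an ODBC connection string for HiveServer2; the attribute names (Driver, Host, Port, HiveServerType, AuthMech, UID, PWD) follow the conventions of the Simba-based Hortonworks Hive ODBC driver, and the host name and credentials are placeholders mirroring this test setup, not values from a real system.

```python
def hive_odbc_conn_str(host: str, user: str, password: str,
                       port: int = 10000,
                       driver: str = "Hortonworks Hive ODBC Driver") -> str:
    """Assemble an ODBC connection string for HiveServer2.

    AuthMech=3 selects user name/password authentication, which matches
    the unsecured (non-Kerberos) cluster used in this test.
    """
    return (f"DRIVER={{{driver}}};Host={host};Port={port};"
            f"HiveServerType=2;AuthMech=3;UID={user};PWD={password}")

# Placeholder host and the hive/Ibmpdp credentials used in the article.
conn_str = hive_odbc_conn_str("hdp-power8.example.com", "hive", "Ibmpdp")
print(conn_str)
```

The resulting string could then be handed to an ODBC client library (for example `pyodbc.connect(conn_str)`) to run the same Hive queries that QlikView issues through its data source.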