Tags » Hadoop

Hadoop: write and read (part-2) (spark-scala)

Spark provides easy, concise APIs for reading from and writing to Hadoop.

The first step is to create a SparkContext and an SQLContext… 225 more words
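A minimal sketch of that first step, assuming the pre-2.0 SparkContext/SQLContext API the post names and a working Spark installation with Hadoop configured; the app name and HDFS paths are illustrative placeholders, not taken from the post.

```shell
# Illustrative only: requires spark-shell on the PATH and a configured cluster.
spark-shell <<'EOF'
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// spark-shell already provides sc/sqlContext; shown here as for a standalone app
val conf = new SparkConf().setAppName("hadoop-read-write")
val sc = SparkContext.getOrCreate(conf)
val sqlContext = new SQLContext(sc)

// write a small RDD to HDFS, then read it back (path is a placeholder)
sc.parallelize(Seq("one", "two")).saveAsTextFile("hdfs:///tmp/example-out")
sc.textFile("hdfs:///tmp/example-out").collect().foreach(println)
EOF
```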


Hadoop: write and read (part-1) (scala)

This is the first part of the write-to-Hadoop and read-from-Hadoop series. In this first part I will show you the Scala way; 561 more words
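For comparison with the Scala API the post walks through, the same write/read round-trip can be done from the Hadoop CLI; this assumes a configured `hadoop` client, and the file names and paths are illustrative.

```shell
# Write a local file into HDFS, then read it back.
# Requires a Hadoop client configured against a cluster.
echo "hello hadoop" > /tmp/example.txt
hadoop fs -mkdir -p /tmp/example
hadoop fs -put -f /tmp/example.txt /tmp/example/
hadoop fs -cat /tmp/example/example.txt
```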


Exception from container-launch. ... Exit code: 1

I encountered the following issue while running MapReduce code on my local single-node YARN cluster.

Exception from container-launch.
Container id: container_1524296901175_0004_01_000002
Exit code: 1… 523 more words
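Exit code 1 by itself says little; the container's own stderr usually holds the real cause. Assuming YARN log aggregation is enabled, the logs can be pulled with the application id embedded in the container id above:

```shell
# container_1524296901175_0004_01_000002 belongs to
# application_1524296901175_0004 — fetch its aggregated logs.
yarn logs -applicationId application_1524296901175_0004
```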


HBase SQL statement fails with Insufficient permissions for user


The error below comes up when creating a new HBase table:

hbase(main):001:0> create 'anoop','cf1'
ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user ‘anoop’ (global, action=CREATE)
at org.apache.hadoop.hbase.security.access.AccessController.requirePermission(AccessController.java:426) 103 more words
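A common fix, assuming the HBase AccessController coprocessor is enabled and you can authenticate as an HBase superuser (for example the hbase service user), is to grant the user the needed rights from the shell; the user name and permission string here are illustrative.

```shell
# Run as an HBase superuser. RWXCA = read, write, exec, create, admin;
# grant only what the user actually needs.
hbase shell <<'EOF'
grant 'anoop', 'RWXCA'
EOF
```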


HBase snapshots How-To

What is a Snapshot?

A snapshot is a set of metadata information that allows an admin to get back to a previous state of the table. 142 more words
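The basic snapshot workflow, sketched from the HBase shell; table and snapshot names are illustrative. Note that `restore_snapshot` requires the table to be disabled first.

```shell
hbase shell <<'EOF'
snapshot 'my_table', 'my_table_snap'
list_snapshots
disable 'my_table'
restore_snapshot 'my_table_snap'
enable 'my_table'
EOF
```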


Crontab not working for kerberized hadoop

ISSUE: Cron jobs not working on Kerberos-enabled Hadoop, throwing the error below.

ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed ; 164 more words
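A GSS initiate failure from cron usually means the job has no Kerberos ticket: cron does not inherit the interactive session's ticket cache. A common pattern is to kinit from a keytab before the job runs; the keytab path, principal, and Sqoop command below are placeholders, not taken from the post.

```shell
# Crontab entry: obtain a fresh ticket from a keytab, then run the job.
0 2 * * * kinit -kt /etc/security/keytabs/user.keytab user@EXAMPLE.COM && sqoop import --connect jdbc:mysql://dbhost/exampledb --table t1
```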


Like BigData tools, you won’t need AI 99% of the time. #bigdata #data #machinelearning #ai #hadoop #spark #kafka

The Prologue.

Recently I’ve been very curious; I know that alone makes people in tech really nervous. I was curious to find the first mentions of BigData and Hadoop in this blog: April 2012, and in the previous year I’d been doing a lot of reading on cloud technologies and, moreover, data. My thirty-year focus is data, and right now in 2017 I’m halfway through. 1,006 more words