Sunday, June 24, 2012

Convert PHP code to C++ code

HipHop is a source code transformer that converts PHP source code into highly optimized C++ and then compiles it using g++. The currently supported platforms are Linux and FreeBSD; there is no OS X support.
Download it from HERE

Wednesday, June 20, 2012

Tuesday, June 19, 2012

E:Encountered a section with no Package: header,,E:Problem with MergeList /var/lib/apt/lists /us.archive.ubuntu.com_ubuntu_dists_natty_main_binary-i386_Packages, E:The package lists or status file could not be parsed or opened.

Problem :
E:Encountered a section with no Package: header,,E:Problem with MergeList /var/lib/apt/lists /us.archive.ubuntu.com_ubuntu_dists_natty_main_binary-i386_Packages,
E:The package lists or status file could not be parsed or opened.

Probable Reason:
The installation process was interrupted abruptly, or some important installation step did not complete correctly.

Probable Solution:

Try following commands :

sudo rm -vf /var/lib/apt/lists/*
sudo apt-get update

CloudFront: HOW TO MOVE DATA INTO AN HBASE TABLE USING FLUME-N...

CloudFront: HOW TO MOVE DATA INTO AN HBASE TABLE USING FLUME-N...: The first HBase sink was committed to the Flume 1.2.x trunk a few days ago. In this post we'll see how we can use this sink to collect data f...

Monday, June 18, 2012

Want to experiment with android tablet

Install Android apps from Google Play using a PC.
Sign in to Chrome with your Google ID, and sign in with the same ID on your tablet or Android phone. Then browse Google Play on your PC, since it is much easier to search for and install apps there. Whenever your Android tablet or phone is connected to the internet, the apps you chose will automatically be downloaded and synchronized to your device.
You just need to be signed in with the same Google account on your PC as well as on your Android device.

Friday, June 15, 2012

CloudFront: Tips for Hadoop newbies (Part I).

CloudFront: Tips for Hadoop newbies (Part I).: A few months ago, after completing my graduation I thought of doing something new. In quest of that I started learning and working on Apache's...

CloudFront: How to install maven3 on ubuntu 11.10

CloudFront: How to install maven3 on ubuntu 11.10: If you are trying to install the maven2 that comes shipped with your Ubuntu 11.10, and it is not working as intended, you can try following steps...

CloudFront: Error while executing MapReduce WordCount program ...

CloudFront: Error while executing MapReduce WordCount program ...: Quite often I see questions from people who are comparatively new to the Hadoop world or just starting their Hadoop journey that they are ge...

CloudFront: HOW TO CHANGE THE DEFAULT KEY-VALUE SEPARATOR OF A...

CloudFront: HOW TO CHANGE THE DEFAULT KEY-VALUE SEPARATOR OF A...: The default MapReduce output format, TextOutputFormat , writes records as lines of text. Its keys and values may be of any type, since Text...


Thursday, June 14, 2012

CodePool: How to print A to Z in Java easily

CodePool: How to print A to Z in Java easily: To print the alphabets from A-Z in Java without any hassles you just need a for loop like this : For lower case :         for(char ch='...
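The snippet above gets cut off mid-loop. A complete, runnable version of the idea (my own reconstruction, not the linked post's exact code) looks like this:

```java
public class Alphabet {
    public static void main(String[] args) {
        // Upper case: the char type is numeric, so ch++ moves to the next letter
        for (char ch = 'A'; ch <= 'Z'; ch++) {
            System.out.print(ch);
        }
        System.out.println();

        // Lower case works exactly the same way
        for (char ch = 'a'; ch <= 'z'; ch++) {
            System.out.print(ch);
        }
        System.out.println();
    }
}
```

This prints ABCDEFGHIJKLMNOPQRSTUVWXYZ on the first line and abcdefghijklmnopqrstuvwxyz on the second.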


Wednesday, June 13, 2012

Tuesday, June 12, 2012

Java Forecast 4u: Why the methods of interfaces are public and abstr...

Java Forecast 4u: Why the methods of interfaces are public and abstr...: Interface methods are: public since they should be available to third party vendors to provide implementation.  and   abstract becaus...

Java Forecast 4u: What is Lazy Loading ?

Java Forecast 4u: What is Lazy Loading ?: Lazy loading decides whether to load the child objects while loading the parent object. we need to do this setting in Hibernate mapping ...

Java Forecast 4u: Input/Output (I/O) Stream

Java Forecast 4u: Input/Output (I/O) Stream: I/O stands for input/output. Input streams are used to read data from input devices. Output streams are used to write data to output...

Java Forecast 4u: What is the Configuration class in Hibernate?

Java Forecast 4u: What is the Configuration class in Hibernate?: Configuration is a class available in the "org.hibernate.cfg" package. Hibernate runtime system will be stored by installing con...

java.util.concurrent.RejectedExecutionException

Problem:
Exception: java.util.concurrent.RejectedExecutionException while performing any operation on HBase.

Probable Solution and Reason:
You may have closed the table object somewhere; once the table is closed, no operation on it will succeed. Just check whether you closed the table and are then trying to perform an operation using that same table object.
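The same exception is easy to reproduce with a plain ExecutorService: submitting a task after shutdown is the analogue of using a table object after closing it. This is a standalone sketch of the mechanism, not actual HBase code:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;

public class RejectedDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Works fine while the pool is alive
        pool.submit(() -> System.out.println("first task runs fine"));

        pool.shutdown(); // comparable to closing the table object

        try {
            // Any submit() after shutdown() is rejected
            pool.submit(() -> System.out.println("this task is never accepted"));
        } catch (RejectedExecutionException e) {
            System.out.println("caught RejectedExecutionException");
        }
    }
}
```

So when you see this exception, check the lifetime of your table object before suspecting the cluster itself.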

Configure Eclipse for MapReduce and write a sample word count program

If this video does not play please visit :

http://www.youtube.com/watch?v=TavehEdfNDk



This video shows how to configure Eclipse for running a Hadoop MapReduce program, covering the configuration and a sample word count program, which you can get from
http://wiki.apache.org/hadoop/WordCount


Download the Eclipse plugin jar from here:
https://dl.dropbox.com/u/19454506/hadoop-eclipse-plugin-0.20.203.0.jar
This jar will work for newer versions of Hadoop too. Copy this jar file to the Eclipse plugins directory and follow the video.




HBase components and Know what......

HBase is one of the cool projects from Apache: a large-scale, scalable, distributed database built on top of Hadoop. Data is organized into rows and columns and can grow indefinitely as you add new nodes, with no need to reconfigure or mess with the settings much.
It requires Java and Hadoop to run in a fully fledged manner.

Components:

HBaseMaster :

The HBaseMaster is responsible for assigning regions to HRegionServers. The first region to be assigned is the ROOT region which locates all the META regions to be assigned. The HBaseMaster also monitors the health of each HRegionServer, and if it detects a HRegionServer is no longer reachable, it will split the HRegionServer's write-ahead log so that there is now one write-ahead log for each region that the HRegionServer was serving. After it has accomplished this, it will reassign the regions that were being served by the unreachable HRegionServer. In addition, the HBaseMaster is also responsible for handling table administrative functions such as on/off-lining of tables, changes to the table schema (adding and removing column families), etc.

HRegionServer:
The HRegionServer is responsible for handling client read and write requests. It communicates with the HBaseMaster to get a list of regions to serve and to tell the master that it is alive. Region assignments and other instructions from the master "piggy back" on the heartbeat messages.

HBase client:
The HBase client is responsible for finding HRegionServers that are serving the particular row range of interest. On instantiation, the HBase client communicates with the HBaseMaster to find the location of the ROOT region. This is the only communication between the client and the master.




Adapted from the research paper of Ankur Khetrapal and Vinay Ganesh.

Thursday, June 7, 2012

Deep Copy and Shallow Copy in OOPS

Shallow Copy:
A shallow copy does a bit-wise copy of an object, so the new object is an exact copy of the original. This is where the problem comes in: if the object being cloned has a reference variable pointing to some other data or object, then the clone will contain a reference to the old object's data only.


*** Soon I will add an image to clarify this concept ***


Deep Copy:
A deep copy is a full duplicate of the object: new copies of the referenced data or objects are created as well.


*** Soon I will add an image to clarify this concept ***
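Until the images are up, here is a small Java sketch of the difference (the class and field names are mine, purely for illustration):

```java
class Marks implements Cloneable {
    int[] scores = {90, 80};

    // Shallow copy: Object.clone() copies fields bit-wise,
    // so the clone shares the SAME int[] as the original.
    Marks shallowCopy() throws CloneNotSupportedException {
        return (Marks) super.clone();
    }

    // Deep copy: the referenced array is duplicated as well.
    Marks deepCopy() throws CloneNotSupportedException {
        Marks copy = (Marks) super.clone();
        copy.scores = scores.clone();
        return copy;
    }
}

public class CopyDemo {
    public static void main(String[] args) throws CloneNotSupportedException {
        Marks original = new Marks();

        Marks shallow = original.shallowCopy();
        shallow.scores[0] = 0;                  // changes the shared array
        System.out.println(original.scores[0]); // prints 0 -- original affected

        Marks deep = original.deepCopy();
        deep.scores[1] = 0;                     // touches only the duplicate
        System.out.println(original.scores[1]); // prints 80 -- original untouched
    }
}
```

The only difference between the two methods is the extra scores.clone() call, which is exactly the "new copy of the referenced data" described above.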

Compile a .cs file located in a different folder of the disk from another .cs program


// Launch the C# compiler (csc.exe) against a source file anywhere on disk
ProcessStartInfo info = new ProcessStartInfo(@"C:\Windows\Microsoft.NET\Framework\v3.5\csc.exe");
// Compile C:\ss\Class1.cs into C:\ss\Class1.dll
info.Arguments = @" /out:C:\ss\Class1.dll C:\ss\Class1.cs";
info.UseShellExecute = false; // start the process directly, without the OS shell
Process.Start(info);

Wednesday, June 6, 2012

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask, java.io.IOException: Exception reading file:/

Exception in hive :

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask, java.io.IOException: Exception reading file:/

Error Stack :


Error initializing attempt_201206070234_0004_m_000002_0:
java.io.IOException: Exception reading file:/../Hadoop/hdfs/tmp/mapred/local/ttprivate/taskTracker/shashwat/jobcache/job_201206070234_0004/jobToken
at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:135)
at org.apache.hadoop.mapreduce.security.TokenCache.loadTokens(TokenCache.java:165)
at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1154)

Tuesday, June 5, 2012

FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/user. Name node is in safe mode.

Error Message:


FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/user. Name node is in safe mode.

Solution:

The NameNode sits in safe mode while it starts up and leaves it on its own once enough blocks have been reported, so often you just need to wait a little and retry. If it stays stuck in safe mode, you can force it out with:

hadoop dfsadmin -safemode leave

FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught. NestedThrowables: java.lang.reflect.InvocationTargetException FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Error in hive :
 
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask


Probable Solution:

If you have copied extra jars from somewhere into the Hive lib folder, that is what is causing the problem, so remove the jars that you added and then try again.

Also, if you have defined aux jars, check whether different jars on that path are colliding.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/common/LogUtils$LogInitializationException

Exception :

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/CommandNeedRetryException

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/common/LogUtils$LogInitializationException

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.CommandNeedRetryException

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.common.LogUtils$LogInitializationException


Solution :

In hadoop-env.sh, if you are specifying HADOOP_CLASSPATH, then specify it like this:

export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/home/shashwat/Hadoop/hadoop/lib:/home/shashwat/Hadoop/hadoop

Bihar board intermediate result


Not for programmers : )




Visit this link:

http://results.bihareducation.net/

or

Send ON RESULTSALERT to 9870807070

Inheritance Example

