Hi everyone,
I need some information to make a decision that matters a lot to me. I know this is a
bit lengthy, but I request you to read through it.
Currently I am working as an application developer using Java and
frameworks like Spring and Hibernate. Sometimes I also have to write custom frameworks
for specific purposes. I enjoy designing applications, algorithms, and frameworks more than
just learning an existing framework and using it. Yes, I do use and love Vim as my primary
editor, and I love to code.
Recently my boss has offered me a position in which I will have to work with big data.
Below are the prerequisite skills for the position:
hands-on experience in Core Java, Unix/Linux, and SQL, plus good analytical skills to grasp
and apply the concepts in Hadoop.
- Mandatory: Core java, Unix / Linux, SQL
- Good to have: Python, Spring / .Net / C++
- Highly desirable: Linux administration, Big Data skills (Hadoop, HBase, Cassandra,
Spark, Splunk, MarkLogic, MongoDB, etc.)
I have an intermediate understanding of SQL, Python, and Unix/Linux. From a quick
search, I have found that big data is about huge volumes of data, and that Hadoop is a
framework for processing such data.
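From what I gathered, Hadoop jobs are typically written against a map/reduce programming model. Here is my rough understanding of that model sketched in plain Java (no Hadoop dependencies, not the actual Hadoop API; just the classic word-count idea, to show the style of code involved):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // "Map" step: split each input line into individual words.
    // "Reduce" step: sum the occurrences per word.
    // Hadoop distributes these two steps across a cluster;
    // here both run locally in one JVM.
    public static Map<String, Long> wordCount(String[] lines) {
        return Arrays.stream(lines)
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        String[] lines = {"big data big ideas", "data matters"};
        System.out.println(wordCount(lines));
    }
}
```

If my understanding is right, the framework handles the distribution, fault tolerance, and shuffling between the two steps, and the developer mainly writes the map and reduce logic.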
Given what I enjoy doing, do you think I will be comfortable with
big data? Will I get to design frameworks/algorithms or write code from
scratch? Or is big data only about data analysis and making sense of the data for a business?
Any light on this matter would be of immense help.
Regards,
Ven.