Srinivas Mupparapu

Greenhorn
since Feb 12, 2004

Recent posts by Srinivas Mupparapu

Hi Romain,
Thank you for taking the time to write about performance, a subject that typically does not get enough attention during application development. I wish there were a section on patterns for securing cached application data. Do you have any insights into application security features in Java 8?

Thank you

Satyaprakash Joshii wrote:I want to know: What is in Hadoop map reduce which was not in Google Map Reduce?



The difference is that Hadoop is open-source Apache software, whereas Google's implementation is not. Hadoop was built based on a white paper that Google published on MapReduce. Look at Hadoop's history for more info.
11 years ago
You need to inspect your use case carefully. Showing large amounts of data in a JSP is not necessarily good practice, and it is no surprise that processing terabytes of data takes a long time. One thing you may want to do is run your process in batch mode (for example, nightly or a few times a day) and store the result in HDFS. Have your JSP read the readily available result from HDFS and display it on demand.
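The batch-then-serve idea above can be sketched in plain Java. The class name, file layout, and report format below are assumptions for illustration only; a real job would write to HDFS through the Hadoop FileSystem API rather than the local filesystem.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;
import java.util.concurrent.*;

// Sketch of the batch-then-serve pattern: a scheduled job precomputes the
// heavy result and writes it to a well-known location; the web tier only
// reads the precomputed file on demand.
public class BatchReportCache {
    private final Path resultFile;

    public BatchReportCache(Path resultFile) {
        this.resultFile = resultFile;
    }

    // Expensive computation, run off-line (e.g. nightly), never per request.
    public void refresh(List<String> report) throws IOException {
        Path tmp = resultFile.resolveSibling(resultFile.getFileName() + ".tmp");
        Files.write(tmp, report);
        // Atomic move so readers never see a half-written report.
        Files.move(tmp, resultFile, StandardCopyOption.REPLACE_EXISTING,
                   StandardCopyOption.ATOMIC_MOVE);
    }

    // Cheap read, called from the JSP/servlet on demand.
    public List<String> read() throws IOException {
        return Files.readAllLines(resultFile);
    }

    // Schedule the refresh periodically instead of per request.
    public ScheduledFuture<?> scheduleDaily(ScheduledExecutorService ses,
                                            Runnable refreshTask) {
        return ses.scheduleAtFixedRate(refreshTask, 0, 24, TimeUnit.HOURS);
    }
}
```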
11 years ago
MapReduce uses a record reader behind the scenes, which by default reads one line at a time. You can override this behaviour with a custom record reader and take control of what constitutes a record. Look into the org.apache.hadoop.mapred.RecordReader interface; several implementations of it are available out of the box.
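As an illustration of what a custom record reader controls, here is a stand-alone analogy in plain Java that treats a blank-line-delimited paragraph as one record. This mirrors the idea only; it is not Hadoop's actual RecordReader interface.

```java
import java.io.BufferedReader;
import java.io.IOException;

// Plain-Java illustration of what a record reader does: it decides what one
// "record" is. Hadoop's default reader treats each line as a record; a custom
// implementation could instead group lines until a blank-line delimiter,
// as sketched here.
public class ParagraphRecordReader {
    private final BufferedReader in;

    public ParagraphRecordReader(BufferedReader in) {
        this.in = in;
    }

    // Returns the next multi-line record (paragraph), or null at end of input.
    public String next() throws IOException {
        StringBuilder record = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            if (line.isEmpty()) {
                if (record.length() > 0) break; // end of current record
                continue;                       // skip leading blank lines
            }
            if (record.length() > 0) record.append('\n');
            record.append(line);
        }
        return record.length() > 0 ? record.toString() : null;
    }
}
```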
11 years ago
Are there any best practices for managing logical data deletion in an HBase table? I have a use case where I need to keep the data in the HBase table forever, never physically deleting it, for historical and auditing purposes. When an application calls delete on a row, I need to translate the call into a logical delete instead of a physical one. I manage row versioning by suffixing the row key with a timestamp; I am not using HBase's internal versioning mechanism.

I appreciate any insight.
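The scheme described above can be modeled in plain Java with a sorted map standing in for the HBase table. The zero-padded key layout and the "DELETED" tombstone marker below are assumptions for illustration, not HBase APIs.

```java
import java.util.Map;
import java.util.NavigableMap;
import java.util.TreeMap;

// Sketch of logical deletion with timestamp-suffixed row keys: a delete
// writes a tombstone version rather than removing anything, so the full
// history is preserved for auditing.
public class LogicalDeleteTable {
    private final NavigableMap<String, String> rows = new TreeMap<>();
    private static final String TOMBSTONE = "DELETED";

    private static String rowKey(String id, long ts) {
        // Zero-pad so lexicographic order matches numeric timestamp order.
        return id + "_" + String.format("%019d", ts);
    }

    public void put(String id, long ts, String value) {
        rows.put(rowKey(id, ts), value);
    }

    // Logical delete: append a tombstone version, keep all history.
    public void delete(String id, long ts) {
        rows.put(rowKey(id, ts), TOMBSTONE);
    }

    // Latest version wins; a tombstone hides the row from readers.
    public String get(String id) {
        Map.Entry<String, String> entry =
                rows.floorEntry(rowKey(id, Long.MAX_VALUE));
        if (entry == null || !entry.getKey().startsWith(id + "_")) return null;
        return TOMBSTONE.equals(entry.getValue()) ? null : entry.getValue();
    }
}
```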
11 years ago
To add to what Tibi Kiss has said above, one can use Hadoop to store large data and use the MapReduce framework to index the data with Lucene. You can then make the resulting Lucene index documents searchable using Solr.
11 years ago
There are a couple of options depending on your file format:
- As you mentioned, you can use the Hadoop FileSystem API in a servlet to read and process the data and display it in your JSP.
- You could create a Hive external table and use HiveQL over JDBC to retrieve your data into your JSP. Keep in mind that HiveQL is similar to SQL but with limited features.
11 years ago
In addition to what Junilu has said above, going through the book "Hadoop: The Definitive Guide" by Tom White should give you a fair idea of what you need to learn. A typical use case of Hadoop involves using Hive and/or Pig (if Hadoop MR alone is not sufficient) to process the data stored in HDFS. If you need random read/write access to your data stored in HDFS, then you also need to look into HBase.
11 years ago
Hello Nakataa,
In an MR program you need to specify the output format for your "reduce" phase, and the final output depends on it. The final result from the reduce phase is written to the output folder (in HDFS) that you specified in your MR job specification. The number of output files from the reduce phase equals the number of reducers that you specified on your JobConf object.

If you are running HBase MR with TableOutputFormat as your destination, then your result is in the HBase table that you specified when you created the job.

Include the code from your MR Driver class in this post if you need further help.
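The one-output-file-per-reducer behaviour follows from how map output keys are partitioned: each key goes to exactly one reducer, and each reducer writes one part-NNNNN file. The demo below mirrors the formula used by Hadoop's default HashPartitioner in plain Java; the class and method names here are illustrative, not Hadoop's own.

```java
// Demonstrates why reduce output file count equals reducer count: a
// partitioner maps every key to exactly one of numReducers partitions,
// and each partition is written by one reducer to one part file.
public class HashPartitionDemo {
    // Same formula as Hadoop's default HashPartitioner.getPartition:
    // mask off the sign bit, then take the remainder.
    public static int partitionFor(String key, int numReducers) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReducers;
    }

    // Name of the output file that reducer n produces, e.g. part-00000.
    public static String outputFileFor(int partition) {
        return String.format("part-%05d", partition);
    }
}
```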
11 years ago
distcp is for copying large amounts of data to and from Hadoop filesystems in parallel. I haven't heard of anyone using it to copy files from a non-HDFS filesystem to HDFS. I am curious to know whether you have solved your problem.
11 years ago
I ran into the same problem when I was trying to create a new web service. The webservices.xml file contains only the XML prolog; it had a web service definition before, but it was removed manually. Please advise.
15 years ago
This is a new feature only available in WAS 5. As IBM continues to tighten the noose on developers' ability to do interesting things in the app server, even the ability to create a thread in a servlet will be denied. This is only a warning, but you will need to come up with an alternate way to make DB calls from a thread in the future. Maybe use an HTTP post at the designated interval?

Visit this page for more info:
http://www-1.ibm.com/support/docview.wss?rs=180&context=SSEQTP&q=J2CA0075W&uid=swg21109248&loc=en_US&cs=utf-8&lang=en+en
19 years ago
Hi All,
While compiling EJB classes from the command prompt I am getting these errors:
package javax.ejb does not exist
cannot resolve symbol EJBObject
and so on.
I believe this means the necessary JAR files containing these packages/classes are missing from the CLASSPATH variable.
I have opened all the JAR files under my J2EE installation directory but could not find one that has the javax.ejb package in it.
As someone advised, I also tried running the setEnv.bat file, but it did not help.
Any help is greatly appreciated.
Thank you,
Srini
20 years ago
You can specify only one file with each include directive:
Example: <%@ include file="test.jsp" %>
You can pass parameters to an included JSP file using <jsp:include> as follows:
<jsp:include page="paramTest.jsp" >
<jsp:param name="firstName" value="Bush" />
</jsp:include>
Hope this will help you.
[ February 12, 2004: Message edited by: Srinivas Mupparapu ]
20 years ago
JSP