Marshall Blythe

Ranch Hand
since Feb 26, 2013
Cows and Likes
Cows: 0 received (0 in the last 30 days), 0 given
Likes: 8 received (0 in the last 30 days), 5 given (0 in the last 30 days)

Recent posts by Marshall Blythe

Irrespective of your choice of security framework, you should definitely become familiar with the OWASP project and its Top 10 web application security flaws. I've found these to be invaluable resources when designing secure web applications.
6 years ago

Dave Tolls wrote:
Why are you not simply forwarding to the JSP pages?
Why do a redirect?



The OP may be implementing the Post-Redirect-Get pattern. However, you could make an argument for forwarding in cases like this, where an error prevents the state from actually changing.
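If it is Post-Redirect-Get, the shape is roughly as follows; the servlet mapping, class name, and JSP path are purely illustrative:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative sketch of Post-Redirect-Get; all names are made up.
@WebServlet("/order")
public class OrderServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // ... apply the state change (e.g. save the order) ...
        // Redirect so that a browser refresh re-issues a harmless GET instead of re-posting.
        resp.sendRedirect(req.getContextPath() + "/order");
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // The GET simply renders the view and is safe to repeat.
        req.getRequestDispatcher("/WEB-INF/jsp/orderConfirmation.jsp").forward(req, resp);
    }
}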
6 years ago

david dabush wrote:
Is there any annotation to prevent binding of field b?



You can control binding very precisely by adding an @InitBinder method to your controller. This method can accept an argument of type WebDataBinder, which allows you to selectively allow or disallow binding for specific fields. For example, to disable binding to any field named "b" you could do this:
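// Inside the same @Controller class; the method name is arbitrary.
@InitBinder
public void initBinder(WebDataBinder binder) {
    // Prevent any request parameter from being bound to a field named "b".
    binder.setDisallowedFields("b");
}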



7 years ago

Emanuele Mazzante wrote:
- Composite primary key: I can remove the auto-increment Person_ID field and make a composite key from all the remaining fields. In this case I can catch the duplicate-key SQL exception and inform the user about it. But is this considered a good design? Maybe it is too redundant to have a key that relates the values of the other four fields? Might it adversely affect performance?



At this point I'd be more concerned about simplicity and correctness than performance. I've used this technique many times with no adverse performance effects. True, there is some overhead in the database to maintain the index for the unique constraint, but this is what databases are designed to do. I think you'll find that the performance impact of performing a pre-select is greater than that of the index maintenance. At any rate, you won't know if you have a performance concern until you put a solution into effect and measure the results. Just define the constraint on the minimum subset of columns required to define uniqueness, and let the database do the rest.

Emanuele Mazzante wrote:
- Stored procedure and trigger: I have not yet studied these, so I do not know whether this approach would let me inform the user that the insert was rejected because of the attempted duplication.
- Would it be possible to do something like this: INSERT INTO People (field1, field2, ...) VALUES (value1, value2, ...) WHERE NOT EXISTS (SELECT * FROM People WHERE field1 = value1 AND field2 = value2 AND ...), or have I said something very stupid?



I think these just shift the potential for a race condition from your Java app to the database because they both ultimately rely on a query being executed BEFORE the insert.

Emanuele Mazzante wrote:
Normally I would execute a select query on the db and, if no records are returned, insert the data. In my small test program this may work, but I don't think this method is suitable for a project in which there are many connected users. Between the search and the insertion, the same kind of record could be created.



You're right to be concerned. The "select-before-insert" pattern sets up a classic race condition, and it should be avoided. Besides, duplicates are probably going to be fairly rare in reality, so the initial query would usually be a waste of resources. Just attempt the insert and let the database tell you if it failed. Make the database work for you: define a unique constraint on the desired columns and let the database reject any attempt to insert duplicates. Whenever an insert attempt causes a unique constraint violation, a SQLException will be thrown with a vendor-specific error code, and you can catch and inspect this in your application. When the error code indicates that a unique constraint violation has occurred, you can display a friendly error message to the user. Otherwise the SQLException can be re-thrown or treated as a fault condition, as appropriate for your application.
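For illustration, here is a rough JDBC sketch of that approach; the table, columns, and method are made up, and it assumes a unique constraint has already been defined on the relevant columns:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.SQLIntegrityConstraintViolationException;
import javax.sql.DataSource;

public class PersonInserter {

    // Returns false if the database's unique constraint rejected the row as a duplicate.
    public boolean insertPerson(DataSource dataSource, String firstName, String lastName)
            throws SQLException {
        String sql = "INSERT INTO people (first_name, last_name) VALUES (?, ?)";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, firstName);
            ps.setString(2, lastName);
            ps.executeUpdate();
            return true;
        } catch (SQLIntegrityConstraintViolationException e) {
            // Most JDBC 4 drivers throw this subclass for unique constraint violations;
            // otherwise inspect SQLException#getErrorCode() for your vendor's duplicate-key code.
            return false;
        }
        // Any other SQLException propagates to the caller as a genuine fault.
    }
}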
Assuming your method actually throws an exception, as Jayesh mentioned, there's one other thing to consider. By default Spring's transaction management will perform a rollback only for unchecked exceptions (i.e. subclasses of RuntimeException). If you want it to roll back for checked exceptions (i.e. subclasses of Exception), then you must specifically list those exceptions in the rollbackFor property of the @Transactional annotation. For example:
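// Sketch: SomeCheckedException and Person are placeholders for your own types.
@Transactional(rollbackFor = SomeCheckedException.class)
public void savePerson(Person person) throws SomeCheckedException {
    // ... persistence logic that may throw the checked SomeCheckedException ...
}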



You can read more about the various rollback options in the rollback rules section of the Spring reference docs and the JavaDoc for the @Transactional annotation.
7 years ago

Stephan van Hulst wrote:
Personally I love the combination of Maven and Git. That way all I have to worry about are the program sources, documents and the POM. Almost no clutter!



Agreed: I've been doing this with SVN instead of Git, but the end result is the same. The m2e plugin for Eclipse does a nice job of generating the various Eclipse project files from the POM.
Right-click on your errant project in Eclipse, choose Properties --> Java Build Path from the menu, then click on the "Libraries" tab. Do you see a JRE system library listed there? If not, that could explain the problem, and you'll need to add the missing library for your chosen JRE. Why did this suddenly stop working? I dunno; Eclipse can be a trying and fickle beast at times.
Have you enabled annotation-driven transactions with a line like this in your Spring config file?
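For reference, that line typically looks like the following; it assumes the tx namespace is declared on the beans element and that your PlatformTransactionManager bean uses the conventional default name:

<tx:annotation-driven transaction-manager="transactionManager"/>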



If you have, then try enabling DEBUG-level logging for the org.springframework package and exercise the code to see what output is logged. This will let you see what Spring is doing behind the scenes, e.g. which transactional methods it discovers and when it commits or rolls back at transaction boundaries. I've done this before to diagnose transaction-related problems.
7 years ago
I generally favor prevention wherever practical, but it can be tricky to cover all scenarios. Consider Paul's example of a GUI with a grayed-out delete button in the context of a typical web application. You can render a web page dynamically and enable certain delete buttons while disabling others according to the business rules, and that will be sufficient to safeguard against your typical, benign user. However, an unethical person could use a tool like Fiddler to generate practically any HTTP request imaginable, including the one that would be sent if the user were able to click one of the disabled delete buttons. The server-side portion of the application can never trust the client: even though it just sent the client a page that should prevent the user from deleting item #123, the very next HTTP request it handles could be one targeting the deletion of item #123. You have to implement these safeguards on the server too; that's where the buck really stops.
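To make that concrete, here is a hedged sketch of the server-side half; the servlet method and its permissionService and itemService collaborators are invented for illustration:

// Inside a hypothetical delete-handling servlet; permissionService and itemService are made-up collaborators.
@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp)
        throws ServletException, IOException {
    long itemId = Long.parseLong(req.getParameter("itemId"));

    // Re-check the business rule on the server; never trust that the UI enforced it.
    if (!permissionService.canDelete(req.getRemoteUser(), itemId)) {
        resp.sendError(HttpServletResponse.SC_FORBIDDEN);
        return;
    }

    itemService.delete(itemId);
    resp.sendRedirect(req.getContextPath() + "/items");
}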
7 years ago

Greg Bag wrote:
The code above shows connection pooling.



Yes, you're handling the connection properly, but you may need to fine-tune the connection pool to ensure that it performs well under load (e.g. check the maximum number of pooled connections, the connection eviction policy, etc.). That can come later, during the load/stress testing phase of your project.
7 years ago

Greg Bag wrote:
So is this personDaoImpl thread-safe? If I was to use a single instance variable in the servlet, that wouldn't pose an issue? What if 1000 requests call createPerson at the same time on the same instance of PersonDaoImpl?



It's thread-safe, but you should consider using the connection pooling features of your DataSource if your servlet is going to be handling hundreds of concurrent requests. The connection pool is usually administered in the JEE container and is transparent to the client application. Check your JDBC driver and JEE container documentation to see how to set up a connection pool.
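For reference, obtaining a container-managed, pooled DataSource via JNDI looks roughly like this; the JNDI name is just an example, and exception handling is omitted:

// javax.naming.InitialContext and javax.sql.DataSource
DataSource dataSource = (DataSource) new InitialContext().lookup("java:comp/env/jdbc/PersonDS");

try (Connection con = dataSource.getConnection()) {   // borrows a connection from the pool
    // ... run your statements ...
}   // con.close() returns the connection to the pool rather than physically closing it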
7 years ago

Greg Bag wrote:
Does the getConnection() pose a concurrency issue?
Can the 1000 different requests pose a threat to the datasource object from the above code?



The DataSource is just an interface, but most implementations that you obtain via JNDI lookup in a JEE container are thread-safe. In addition, most of them support connection pooling. Consult the documentation for your JDBC driver to be sure.

Greg Bag wrote:
What if there was a private PersonDao personDao = new PersonDaoImpl() as an instance in the servlet. Now what happens?



As long as the PersonDaoImpl is thread-safe then there's no harm in keeping a reference to it in an instance variable.

Greg Bag wrote:
What I'm really confused on is what is happening inside the doGet when the PersonDaoImpl is instantiated. Can someone give me a walkthrough please. The gist of my question is if the code I have up there is thread-safe.



Since the DataSource is looked up once, during static initialization of the AbstractDao, the instantiation of PersonDaoImpl is very lightweight. So long as your DataSource is thread-safe (and most are), the rest of the code you posted is thread-safe.
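For anyone following along, here is a rough reconstruction of the general shape being discussed (not the actual code from this thread); PersonDao, Person, and the JNDI name are placeholders, and the classes would live in separate source files:

public abstract class AbstractDao {

    // Resolved exactly once, when the class is initialized.
    protected static final DataSource DATA_SOURCE = lookupDataSource();

    private static DataSource lookupDataSource() {
        try {
            return (DataSource) new InitialContext().lookup("java:comp/env/jdbc/PersonDS");
        } catch (NamingException e) {
            throw new IllegalStateException("DataSource lookup failed", e);
        }
    }
}

public class PersonDaoImpl extends AbstractDao implements PersonDao {

    // No instance fields: each call works only with local variables and a pooled connection,
    // which is why creating a PersonDaoImpl per request is cheap and the class is thread-safe.
    public void createPerson(Person person) throws SQLException {
        String sql = "INSERT INTO people (name) VALUES (?)";
        try (Connection con = DATA_SOURCE.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, person.getName());
            ps.executeUpdate();
        }
    }
}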
7 years ago
According to the JavaDoc for Transactional#readOnly:

This just serves as a hint for the actual transaction subsystem; it will not necessarily cause failure of write access attempts. A transaction manager which cannot interpret the read-only hint will not throw an exception when asked for a read-only transaction.



Why is readOnly set to true on a method that performs multiple inserts? Although it causes no real harm, it could be confusing for the next developer who comes along to maintain the code.
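By way of contrast, the hint reads naturally on a pure query method; the repository/template details below are made up for illustration:

// Illustrative only; jdbcTemplate and personRowMapper are assumed to exist in this class.
@Transactional(readOnly = true)
public List<Person> findAllPeople() {
    // A pure query: the read-only hint is meaningful here.
    return jdbcTemplate.query("SELECT id, name FROM people", personRowMapper);
}

@Transactional   // readOnly defaults to false, which is what an inserting method wants
public void addPeople(List<Person> people) {
    // ... perform the inserts ...
}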
8 years ago
I think he's referring to the various objects like request, response, session, page, etc. that the container makes available without explicit declaration. If you're going to disable these (assuming that's even possible) then you might as well write a static HTML page instead of a JSP. Where do interviewers come up with these fringe questions?
8 years ago
JSP