Frank Silbermann
Ranch Hand
since Jun 06, 2002

Recent posts by Frank Silbermann

OK, so I get that there is a danger with conflicting dependencies.

Does that mean that, to call a RESTful webservice from within our Nexus library, I should give it jars that carry Spring's versions of Jackson and JAX-RS?
If so:

How do I find which versions Spring includes -- do I trace through the Maven POM files?
Where do I find instructions on using them outside the context of Spring?

Or does that mean I should try to avoid conflicts by using completely different implementations of JAX-RS and JSON processing?
If so:

What do you suggest I use?
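For the second route, here is a dependency-light sketch of what I have in mind -- assuming Java 11+ so the JDK's built-in HttpClient is available, plus a jackson-databind version aligned with whatever the Spring Boot consumers already pull in (the endpoint URL and Customer class are made up):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import com.fasterxml.jackson.databind.ObjectMapper;

public class RestClientSketch {
    private static final HttpClient HTTP = HttpClient.newHttpClient();  // JDK built-in; no extra jars
    private static final ObjectMapper MAPPER = new ObjectMapper();      // the only external dependency

    public static Customer fetchCustomer(long id) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/customers/" + id))  // hypothetical endpoint
                .header("Accept", "application/json")
                .GET()
                .build();
        HttpResponse<String> response = HTTP.send(request, HttpResponse.BodyHandlers.ofString());
        return MAPPER.readValue(response.body(), Customer.class);  // JSON -> Java object
    }

    public static class Customer {  // hypothetical DTO mirroring the service's JSON
        public long id;
        public String name;
    }
}

That would sidestep JAX-RS entirely and leave only one JSON library version to keep in agreement with the consumers.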
2 years ago
I have been tasked with doing a small piece of maintenance on a system that I think was poorly designed (but there's nothing I can do about that now).

It's a stable system that is rarely modified.

The overall system consists of several Spring Boot webservice projects that share common code via a dependency on a Nexus library. The library is not an executable, nor is it a Spring Boot project (nor is it Spring of any kind).  It's just a Maven project that produces a Jar.  The library merely defines a huge, complex class that can be instantiated by projects that depend on this library, so they can call the methods on the instantiated object.

Within this library is a private method that directly reads an external system's RDBMS.  This private method is called by several of the library's public methods, which in turn are called by several of the Spring Boot projects that depend on the library.

My assignment is to modify that method, replacing the direct reading of the external RDBMS with a call to the external system's new restful webservice.

My experience calling RESTful webservices from within old Java systems was hair-pulling -- getting things to work, and figuring out how to assemble and use the libraries that handle things like HTTP connections and conversion to and from JSON.

I have been greatly impressed by descriptions of RESTful webservice consumption from within Spring Boot projects using RestTemplate -- the ease of configuration, the small number of steps needed, and the automatic conversion of results from JSON to a structurally analogous Java object.

However, this library that I must modify is NOT a Spring Boot project.  And indeed, every tutorial I have read about Spring and Spring Boot seems to focus on creating executables, particularly web applications that respond to HTTP calls either from web pages or from other applications.

Is there a simple way that I can easily call a RESTful web service from within this library, much like I could in a Spring Web project?  That is to say, without fundamentally changing the way this library jar is imported in Eclipse, built, deployed and consumed -- merely by adding a few Maven dependencies to what I now have?
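(One thing I've since read, which I'd want confirmed: RestTemplate lives in spring-web and can apparently be used without any Spring Boot application, or even a Spring context, around it. A sketch, assuming spring-web and jackson-databind are added to the library's pom.xml -- the URL and Customer class are invented:

import org.springframework.web.client.RestTemplate;

public class LibraryRestCall {
    // Plain instantiation: no ApplicationContext, no auto-configuration.
    // With jackson-databind on the classpath, RestTemplate registers its
    // JSON converter itself and maps the response body onto Customer.
    private static final RestTemplate REST = new RestTemplate();

    public static Customer fetchCustomer(long id) {
        return REST.getForObject("https://example.com/api/customers/{id}",  // hypothetical endpoint
                Customer.class, id);
    }

    public static class Customer {  // hypothetical DTO matching the JSON shape
        public long id;
        public String name;
    }
}

If that holds, the only packaging change would be a couple of new Maven dependencies; the jar would still build, deploy and be consumed exactly as before.)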
2 years ago

Frank Carver wrote:

Frank Silbermann wrote:  We have hundreds of thousands of lines of code that implement problem-domain rules, checks and computations.
All of which would have to be modified line by line to refer to persistent objects and data transfer objects that have a different API.

I'm trying to avoid the need to "cure our technical debt" -- since most of it is nothing more than "having written it using libraries and methods that are no longer fashionable."



I sympathise with your situation, having been in one form of this conundrum many times.

If I can offer anything, it would be to suggest that your biggest technical debt is not the use of unfashionable libraries and methods, but the way business code is tangled in with infrastructure code, with no clear "seams" where obsolescent sections can be replaced in either area. Unfortunately this is one of the side-effects of the "vertical" DTO pattern compared with a more "horizontal" storage facade approach.

However you look at it this will be a tricky system to modernise. I hope you have good test coverage!



No test coverage.  And I submit that solutions that presume to avoid such problems are just playing whack-a-mole with the burden of change.

Yes, you can code a facade to your persistence layer.  If your new persistence layer has a reasonably similar API, and reasonably similar performance attributes, you can avoid changing your business code by re-implementing the facade.
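To be concrete about what such a facade looks like (names invented for illustration):

public class FacadeSketch {
    // Invented domain type, standing in for one of our generated VO classes.
    public static class Customer { public long id; public String name; }

    // The seam: business code is written against this interface only.
    public interface CustomerStore {
        Customer findById(long id);
        void save(Customer c);
    }

    // Current implementation: raw JDBC hidden behind the facade.
    public static class JdbcCustomerStore implements CustomerStore {
        public Customer findById(long id) { /* raw JDBC here */ return new Customer(); }
        public void save(Customer c)      { /* raw JDBC here */ }
    }

    // A JPA-backed CustomerStore could replace JdbcCustomerStore without
    // touching business code -- *if* its API and defaults line up.
}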

However, suppose you use a smart persistence tool -- such as JPA over Hibernate.  Great -- that will automatically generate not only the persistence operations but also much of the persistence facade used by your business layer, such as the DTOs, with very little labor, using reasonable defaults.  And all your business and presentation logic is written in terms of those DTOs.

Now, twenty years later, let's say JPA and Hibernate are old hat, and even though the application is pretty stable, someone says that to be hip and with it you have to use some other tool.  Whoops!  The new "reasonable defaults" are different from the old reasonable defaults -- and a great deal of painstaking work must be done to make the new system generate compatible DTOs, or to translate the newly generated DTOs into the old ones.

In fact, one could consider much of our old code produced by our DAOGen program to be a facade over the persistence layer, generated from the database tables by using reasonable defaults.  Which we would now painfully have to translate to and from the classes that result from using JPA/Hibernate.

So instead, we may be hiring off-shore contractors to just re-implement the code from scratch using JPA/Hibernate at the bottom (because that's what's done nowadays), and coding directly to those classes.
And should the day come when someone says "Don't use JPA/Hibernate" -- the same problem will recur.

Our current code is written using JDBC.  At the time, we thought we were using a standard.

(Our move ten years ago from Sybase to Oracle told us that there really isn't any such thing as writing code to be agnostic to the persistence layer -- even just due to different capabilities of competing RDBMS.  And now we have a worse problem -- and we're not even changing the RDBMS.)
2 years ago

Tim Holloway wrote:
You're not doing yourself any favors hanging on to an ancient DIY DAO system. Unlike JPA, you can't hire people who already know its quirks, there's probably minimal documentation, and of course no training courses.
And if it's anything like the ones I've seen, it's vastly less functional than JPA. Last one I worked with couldn't even handle transactions involving more than one table.



The code that is generated serves most of our needs without our having to get into the ugliness of JDBC.  We can do transactions involving multiple tables.
A very few places require more complex JDBC.  It's easy to learn, because everyone already knows JDBC, and because the way the DAO and VO objects are generated for one table is exactly the same as for all the other tables.

Tim Holloway wrote:
JPA is relatively painless to migrate to. There are tools that can take an existing database and automatically generate JPA Entity classes from it. The Spring Data module can provide DAOs that can do full CRUD with as little as this (...):

That's the entire code for the DAO!
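For context, a minimal Spring Data DAO is on this order (a sketch with a hypothetical entity; the code elided from the quote may have differed):

import javax.persistence.Entity;   // newer stacks use jakarta.persistence instead
import javax.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;

@Entity
class Customer {       // hypothetical entity standing in for one generated table class
    @Id Long id;
    String name;
}

// The entire DAO: save, findById, findAll, delete, paging -- all generated at runtime.
interface CustomerRepository extends JpaRepository<Customer, Long> { }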



That's for one database table; we have maybe a hundred tables.  And we already have a persistence layer that, if we can keep using it, doesn't require anything new to be written at all.

Tim Holloway wrote:
You might very well be able to collapse your code base down to a nice clean modern 2000 lines or so, depending on where your complex logic lies.



Not hardly.  We have hundreds of thousands of lines of code that implement problem-domain rules, checks and computations.
All of which would have to be modified line by line to refer to persistent objects and data transfer objects that have a different API.


Tim Holloway wrote:
OK. Made my pitch for JPA. Be aware that Spring Boot doesn't require Spring Data in any of its variant forms. You want ugly brute force, it will allow ugly brute force. You want Spring JDBC, fine (but why???).



I'm not even sure that Spring JDBC would allow us to use our existing persistence layer -- wouldn't it have to be built over Spring's JdbcTemplate?
(Yes, that's nicer than raw JDBC -- if we didn't already have it all.)
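That is, each generated method would have to be reworked from raw JDBC into something like this sketch (invented table, column and class names):

import org.springframework.jdbc.core.JdbcTemplate;

public class CustomerDao {  // invented; stands in for one DAOGen-generated class
    private final JdbcTemplate jdbc;

    public CustomerDao(JdbcTemplate jdbc) { this.jdbc = jdbc; }

    public Customer findById(long id) {
        // JdbcTemplate handles connection acquisition, statement cleanup and
        // exception translation -- the boilerplate our generator already emits for us.
        return jdbc.queryForObject(
                "SELECT id, name FROM customer WHERE id = ?",
                (rs, rowNum) -> {
                    Customer c = new Customer();
                    c.id = rs.getLong("id");
                    c.name = rs.getString("name");
                    return c;
                },
                id);
    }

    public static class Customer { public long id; public String name; }
}

Nicer to write from scratch, certainly -- but we wouldn't be writing from scratch; we'd be regenerating and re-verifying a hundred tables' worth.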

Again, the reasons:

1. We already have a persistence layer.
2. We have hundreds of thousands of lines of code written to the API of our old-fashioned persistence layer.

Tim Holloway wrote:
What Spring Boot is is a system that allows you to encapsulate a web application and a web server into a single executable JAR, making for "serverless" deployments and ideal for running in containers,



Which is the reason we are pressured to convert to Spring Boot.

Tim Holloway wrote:
And it definitely won't magically cure your technical debt internal to the application.



I'm trying to avoid the need to "cure our technical debt" -- since most of it is nothing more than "having written it using libraries and methods that are no longer fashionable."
2 years ago
We have a twenty-five-year-old, 500,000-line Java system that includes a web server, batch programs and daemons.
The web application is installed in Tomcat.

The persistence layer is generated by an adaptation of an old DAOGen program.
That is, it reads the database and, for each of a hundred or so tables, generates a DAO class and a VO (i.e., DTO) class.
The generated DAO classes consist of raw JDBC.  Our code is written to directly access the generated methods in these classes.

Yes, I realize that this is no longer considered the ideal way to access the database.  But this is what we have.  We are asked to modify this system for the cloud, i.e. to modify it in a number of respects, including converting it into a Spring Boot application.

Spring Boot normally uses JPA (typically with Hibernate).  If we were writing a persistence layer for a new application that's what we would use.  But we already have a persistence layer to which all our code is written.  I'd prefer not to have to rewrite a 500,000 line system just for the sake of where we'll be running it.

We could do major surgery on the DAOGen program so that, instead of generating raw JDBC, it generates classes using Spring's JdbcTemplate, and so that the generated classes would have pretty much the same API.
But the DAOGen program is a very difficult, abstract program that we'd hate to have to rewrite.  And even then, just as annoying as it is to write lots of boilerplate JDBC code in a new application, it's equally annoying to have to remove boilerplate code from a huge existing system.

Is there any way nowadays that a Spring Boot application can make use of classes containing raw JDBC, in code that assumes the use of Tomcat connection pools?

An advantage of converting to Spring Boot is the ease of adding additional functionality, but I don't want to waste all the code we already have.
Are there any economical options?
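One pattern I'm wondering about (a sketch with invented names, not something I've verified): let Spring Boot own the connection pool and hand its DataSource to the generated classes at startup, so their raw JDBC keeps working unchanged:

import javax.sql.DataSource;
import org.springframework.boot.ApplicationRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LegacyDaoConfig {
    // Spring Boot auto-configures 'dataSource' from application.properties
    // (spring.datasource.url and friends), backed by its pooled implementation.
    @Bean
    ApplicationRunner wireLegacyDaos(DataSource dataSource) {
        return args -> DaoRegistry.setDataSource(dataSource);
    }

    // Hypothetical stand-in for whatever static hook the generated DAOs use
    // today to obtain connections (e.g. a JNDI lookup of the Tomcat pool).
    static class DaoRegistry {
        static volatile DataSource ds;
        static void setDataSource(DataSource d) { ds = d; }
    }
}

(I gather embedded Tomcat can also be coaxed into exposing a real JNDI DataSource if the generated code's lookups can't be redirected, though that reportedly takes extra configuration.)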
2 years ago
Our team is moving from Subversion to Git, and due to peculiarities in our environment I am finding the path confusing.
We use Eclipse and we have been keeping Eclipse configuration in the repository.

The projects in our repository are highly nested, and I'm confused as to the best way to use Eclipse EGit in this context.
The EGit tutorials seem to focus on the easy case, where a Git repository contains a single Eclipse project (and typically where configuration information is not stored in the repository).


To give a sense of what I'm working with, let me describe the structure of our code repository.
The master branch is broken down into two subdirectories, call them A and B.

Taking the easier part first, nested within B is a set of independent Eclipse projects, each of which builds a utility that we developers sometimes run.

Nested within A we have projects A_client, A_webserver, A_batchPrograms, A_serverCommon, A_commonCommon, and A_lib.
Each of these is an Eclipse project, with Eclipse configuration kept in the repository.
However, these are interdependent.

A_commonCommon builds a jar that is added to the classpath of A_client, A_batchPrograms, A_webserver, A_serverCommon.

A_lib contains jar files we use but did not write, which are added to the classpath of A_serverCommon, A_batchPrograms, and A_webserver.

A_serverCommon builds a jar that is added to the classpath of A_batchPrograms and A_webserver.

A_client contains code that A_webserver deploys both as a Java Web Start program and as an applet.

A_webserver also deploys servlets that are referenced from the applet and Web Start code built in A_client.
These servlets are also called by code in A_serverCommon, and from within A_webserver itself.

Classpaths for each of these Java projects, tying these projects together, are among the Eclipse configuration kept in the repository.

Within A at the top level we also have some scripts that are used to build the entire application on the server (creating and gathering the resultant jars).


My question concerns the way to hook up these projects into Eclipse so I can use EGit for source control.

What I am thinking might work is if I do the following:

(1) Use EGit, the command line, or some other tool to clone the Git repository to a directory on my workstation.

(2) Import the entire repository into Eclipse as an EGit NON-Java project (because Eclipse doesn't understand nested Java projects very well).  I would use this top-level project for any EGit operations.

(3) Import each project individually (A_commonCommon, A_client, A_lib, A_serverCommon, A_batchPrograms, A_webserver) from the Git directory, each as a separate Eclipse Java project, telling Eclipse NOT to copy the files into the Eclipse workspace but to edit them in place.

The idea is that by importing all the sub-projects individually as Java projects, Eclipse will use the source-controlled configuration to tie them together.
But EGit will have to work at the level of the entire repository branch.

Does this sound like an approach that would work?  Is there a better way?

I am using Eclipse Mars for a web application using Tomcat.

Recently, we have been required to change the dev/test database URL due to an Oracle upgrade.

I made the change in my (TOMCAT HOME)/conf/server.xml file.
When I try to run the application in Eclipse I get LDAP errors because Eclipse is still looking for the datasource at the old database URL.

I've tried cleaning the project.

A colleague suggested I check in (MY ECLIPSE WORKSPACE)\.metadata\.plugins\org.eclipse.wst.server.core\tmp0\conf\server.xml

Sure enough, it contains my old database URL.  But even though I change it and clean the project repeatedly, the database URL in that Eclipse file keeps reverting to the old one.

What is going on here? How do I force Eclipse to release whatever cache continues to store my old database URL?
I need to write a program that:

(1) Joins a few huge tables (and also maybe a few small static configuration tables), producing a VERY LARGE result set, and
(2) Iterates through the result set to write an extract file.

These tables are used intensively by thousands of users for online transaction processing.
However, the rows I am selecting are no longer being actively updated.

I know how to use JDBC to select my data and to iterate through the result set.

Is there anything special I need to do to ensure that the tables themselves are not locked while the program runs -- that my program does not interfere with ongoing transaction processing of the more current routes?

Or is this automatically taken care of by the design of Oracle's locking mechanisms?
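For what it's worth, here's the JDBC shape I have in mind -- a sketch with an invented connection string and join; the fetch size matters because Oracle's driver defaults to fetching only 10 rows per round trip:

import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ExtractWriter {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/SVC", "user", "pw");  // hypothetical
             PrintWriter out = new PrintWriter("extract.txt")) {

            con.setReadOnly(true);  // advisory hint: this connection only reads

            try (PreparedStatement ps = con.prepareStatement(
                         "SELECT a.id, b.detail FROM big_a a JOIN big_b b ON a.id = b.a_id",  // invented join
                         ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                ps.setFetchSize(1000);  // stream rows in batches instead of 10 at a time
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        out.println(rs.getLong(1) + "\t" + rs.getString(2));
                    }
                }
            }
        }
    }
}

(My understanding is that Oracle's multi-version read consistency means a plain SELECT takes no row or table locks and never blocks writers, so the fetch size is the main thing to tune -- but that's exactly the assumption I'd like confirmed.)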

Matthew Brown wrote:

Bear Bibeault wrote:"Buy Apple"



Young Bear goes along to greengrocer's with puzzled look on face...

Yeah, it's difficult to provide context in two words.  Otherwise, better than "Buy Apple" would be to provide a date and a Powerball number.
9 years ago

Bear Bibeault wrote: I wanted to like Scala, really I did. But what I found off-putting was the community attitude of applying shortcut after shortcut to find the least amount of characters to express a statement. At that point, it's complete gobble-dee-gook to all but the Scala veterans. Cleverness trumps clarity. And I felt that newbies are sneered at. Others may have had a different impression, but that's what I felt and so gave up on Scala.

Sure you can write impenetrable code in any language*, but the impression I got from the Scala community is that (what I call) Obfuscation Through Brevity is an honored tradition. That is so not me.

I always hated the use of the word "code" to describe computer programs.  To me, the word "code" implies obfuscation, as in "We encoded the message so the enemy can't read it."
9 years ago
It was traditional in low-church Protestant denominations to "pass the hat" during the service; people would drop donations into it.

Most synagogues have membership fees. A non-member can visit, but since many Jews attend only on two autumn holidays, most synagogues require tickets for those extremely crowded services. The tickets are mailed out to members, or can be purchased beforehand. An old joke is of a Jew who tries to get in on Yom Kippur but is refused because he has no ticket. He explains that he merely needs to pass a note to a member there. The doorman says, "OK, you can go in -- but don't let me catch you praying!"

9 years ago
For mystery, there is a huge collection of novels in the Bobbsey Twins series.
9 years ago
And to paraphrase Clint Eastwood: "If you need to sing, sing; don't dance."
9 years ago

Campbell Ritchie wrote:No, if you want seven time, you write bars with seven beats.
You are unlikely to fit 7 notes without stems into most bars. 7/1 time, anybody? A septuplet would divide one (or two or three) beats into seven equal parts. It is possible to have a septuplet dividing the entire bar into 7 too; there was an example in the Wikipedia link somebody posted earlier in this discussion.


After thinking it through, it occurred to me that 7-beat music would be written with quarter notes in 7/4 time.
9 years ago

Jeanne Boyarsky wrote:Frank: I live in Queens. Was that snow in '83? My parents have a picture of me in it. I was very small, so the snow mountains looked even bigger.



I'm thinking more of 1960-'63. When I was four feet tall, a two-foot high pile of snow would have been waist high.
9 years ago