building up a string from many pieces

 
Bartender
Posts: 1810
My Java is rusty or I wouldn't have spent the last hour looking for a solution to this. I'm working on a gateway/adapter from a Java server to a mainframe. Each request must begin with a 307-character header (called ESSH). Other data will be added after the header, but that's a problem for later. The string will look sort of like this:

"C            20190501API_CODE_HERE        X          SOURCE   999        BLAH       XXXXX X       " etc on out to exactly 307 characters regardless of spaces.

The class I'm creating will have a bunch of setters that must place the data at a specific starting position and with a certain length.

I've played around with a fixed length array, an ArrayList, and a StringBuilder but I'm having challenges with each of them. I'm beginning to think that I need to split each string into characters and put them into a character array. But this seems to be getting over-complicated.

I saw one solution that uses MarshallStringUtils to stuff everything into a byte array, but that makes my head hurt just looking at it.

I'd love to hear ideas on how others would approach this problem. Thanks.
 
Sheriff
Posts: 21719
It's probably easiest to use a StringBuilder that you initialize with all spaces. You can then add utility methods that overwrite data as needed. For instance:
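(The code block from the original post did not survive. A minimal sketch of the approach described, with a hypothetical `EsshHeader` class and `setField` helper; offsets and widths would come from the actual ESSH spec:)

```java
public class EsshHeader {
    private static final int LENGTH = 307;
    private final StringBuilder buffer = new StringBuilder(LENGTH);

    public EsshHeader() {
        // initialize the whole header to spaces
        for (int i = 0; i < LENGTH; i++) {
            buffer.append(' ');
        }
    }

    // Overwrite 'length' characters starting at 'start' (0-based),
    // left-justified, space-padded, and truncated to fit the field.
    public void setField(int start, int length, String value) {
        String padded = String.format("%-" + length + "s", value);
        buffer.replace(start, start + length, padded.substring(0, length));
    }

    public String build() {
        return buffer.toString();  // always exactly 307 characters
    }
}
```

Because `setField` replaces exactly as many characters as it inserts, the buffer stays at 307 characters no matter what order the setters run in.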


You could perhaps use a char[] instead of a StringBuilder, combined with String's getChars method, but that won't let you add content after the header (unless that too has a fixed size). If you do use a char[], you can use Arrays.fill to set your content to all spaces.
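The char[] variant might look like this (the offset 13 is just an illustration taken from the sample string, not from the real layout):

```java
import java.util.Arrays;

public class CharArrayHeader {
    public static void main(String[] args) {
        char[] header = new char[307];
        Arrays.fill(header, ' ');   // blank out the whole record

        // getChars copies the value directly into the array at a fixed offset:
        // (srcBegin, srcEnd, dst, dstBegin)
        String date = "20190501";
        date.getChars(0, date.length(), header, 13);

        String result = new String(header);
        System.out.println(result.length());  // 307
    }
}
```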
 
Marshal
Posts: 64168
What about String#format? Make sure the numbers add up to 307.
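(The accompanying code block didn't survive, but the idea is presumably width specifiers: `%-Ns` left-justifies and space-pads each field. A sketch with made-up field widths roughly matching the sample string; note that `%s` pads but never truncates, so over-length values must be trimmed first:)

```java
public class FormatHeader {
    public static void main(String[] args) {
        // Widths are illustrative: 1 + 12 + 8 + 20 + 266 = 307.
        // The real widths come from the ESSH layout.
        String header = String.format("%-1s%-12s%-8s%-20s%-266s",
                "C", "", "20190501", "API_CODE_HERE", "");
        System.out.println(header.length());  // 307
    }
}
```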
 
J. Kevin Robbins
Bartender
Posts: 1810
Hi Rob.

That's where I sort of ended up by the end of the day yesterday, except I opted for StringBuffer since it's synchronized and this gateway will see lots of traffic. Does that make sense, or would it be better to use StringBuilder and synchronize it myself? Or am I just assuming the need for synchronization?

 
J. Kevin Robbins
Bartender
Posts: 1810
Hi Campbell. That is a very interesting solution that didn't even occur to me. I'm going to have to look into that a little deeper today.
 
Campbell Ritchie
Marshal
Posts: 64168
If you use the StringBuilder as a local variable in a method, then you won't need synchronisation. If your other fields are liable to be accessed by two threads simultaneously, that race condition may indeed need locking/synchronisation.
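In other words, if each request builds its own header, there is no shared mutable state and neither StringBuffer nor explicit locking is needed. A sketch of that pattern (class name, method, and offsets are illustrative):

```java
public class HeaderFactory {
    // Each call gets its own StringBuilder on its own stack frame,
    // so concurrent requests never touch the same mutable object.
    public static String buildHeader(String apiCode, String date) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 307; i++) {
            sb.append(' ');
        }
        sb.replace(0, 1, "C");
        sb.replace(13, 13 + date.length(), date);
        sb.replace(21, 21 + apiCode.length(), apiCode);
        return sb.toString();  // immutable result, safe to share across threads
    }
}
```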
 
Saloon Keeper
Posts: 20655
OR

You could just use an ETL utility, which is what I usually do. My own preference is Hitachi Pentaho, but Talend is popular.

In this case, you can define the record column-by-column in a column attribute table. The ETL tool takes care of all the grunt work, and it's optimized for large data flows as well.

And if the columns later change, only the column definitions need to be updated, using the GUI job editor. No re-compiling.
 
J. Kevin Robbins
Bartender
Posts: 1810
I've never heard of ETL. It looks interesting but I doubt I have time to get it approved and working. Never time to do it right, but always time to do it over, eh?
 
Tim Holloway
Saloon Keeper
Posts: 20655

J. Kevin Robbins wrote:I've never heard of ETL. It looks interesting but I doubt I have time to get it approved and working. Never time to do it right, but always time to do it over, eh?



ETL stands for Extract, Transform, and Load. A number of DBMSs come with proprietary ETL tools, but Pentaho is one of the open-source, platform-agnostic ones and has no prerequisites other than a JVM, JDBC drivers for the databases that you want to load to or extract from (if any), and maybe a little tweaking for the SWT graphics subsystem, depending on the release installed.

The Pentaho Data Integration ETL tool is called "Kettle". The complete PDI package includes a DDD (drag/drop/drool) GUI tool (Spoon) for designing dataflows using a pipeline flow-chart approach.

ETL specifically refers to:

* Extract - reads data from one or more sources. These can be flat disk files, CSV or Excel files, database tables, cloud data sources and many more. Mix and match as needed.

* Transform - sanitize data. Fill missing field values with good defaults, re-route defective records for external inspection and repair, convert codes into values via table lookups, and, if you like, run full-blown JavaScript or Java code on selected fields. You can split fields, merge fields, insert constant-value columns and so forth.

* Load - Once the data has been processed, select one or more destinations for the results to go to.

This sort of pipeline is highly performant. The various steps from source(s) to destination(s) run in parallel, so you don't have to wait for one stage to complete before beginning another. Also, there are multi-processor options for really heavy loads.

There's a lot more. One job I set up years ago runs once an hour during business hours, scoops up all the tables in a DB/2 database, converts them to CSV (doing a few code lookups along the way), then logs into an external Business Intelligence provider, transmitting the set of CSVs as a ZIP file. I haven't touched it in eons and it's still running.

PDI is a little quirky, alas. I've made a few contributions to make it less so, but there's a certain mindset to it. Essentially, the pipeline is a set of parallel pipes, one per column where the timing is such that the column values for a given row all arrive at a pipeline stage at the same time. It's not quite the same as passing records, since columns can be added and removed from the overall flow at any stage and stages can acquire data from and send data to more than one stage.

On the other hand, if you're going to be doing industrial-grade data manipulation in bulk, it's a technology well worth acquiring. You don't need a programmer to design pipelines (as I said, it's all DDD). It's easy to install, and after that, non-technical personnel can handle just about everything.
 
Tim Holloway
Saloon Keeper
Posts: 20655
Some resources (besides the usual online references)


Learning Pentaho Data Integration 8 CE - Third Edition: An end-to-end guide to exploring, transforming, and integrating your data across multiple sources (December 5, 2017)

https://www.amazon.com/Learning-Pentaho-Data-Integration-end/dp/178829243X


Pentaho Data Integration Cookbook Second Edition 2nd Edition (2013)

There are about half-a-dozen books in print for Pentaho DI and the Kettle suite.
 
J. Kevin Robbins
Bartender
Posts: 1810
Thanks for all the info. It may not fit this project but it's a good tool to have in your toolbox.
 