Hashmap, retrieving value and re-storing with different key

 
Ranch Hand
Posts: 47
Hi there,

I have a HashMap, and my requirement is to retrieve an object from it (using a key I have), make some changes to that retrieved object, and store it back into the HashMap under a different key. The original object in the HashMap must remain unaffected.

The problem is (as expected) that the original object I retrieve also gets changed, and this is something that I don't want :-(.

The object that I need to retrieve and change does not implement Cloneable, but it does implement Serializable.
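
For illustration, a minimal sketch of the effect, using a hypothetical mutable Widget class (all the names here are made up):

import java.util.HashMap;
import java.util.Map;

public class AliasingDemo {

    // Hypothetical mutable value class, just to show the effect.
    static class Widget {
        String name;
        Widget(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        Map<String, Widget> map = new HashMap<String, Widget>();
        map.put("key1", new Widget("original"));

        // get() hands back a reference to the very same object the map holds...
        Widget retrieved = map.get("key1");
        retrieved.name = "changed";
        map.put("key2", retrieved);

        // ...so the entry under "key1" has been changed as well.
        System.out.println(map.get("key1").name);   // prints "changed"
        System.out.println(map.get("key2").name);   // prints "changed"
    }
}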

Could anyone please help? :-)

Many Thanks!

Gratefully,
Aneesha.
 
Greenhorn
Posts: 5
Hi,

I am posting the code that you require; I hope it helps you.


 
Ranch Hand
Posts: 5093
You will need to clone the object retrieved from the Map and modify the clone.
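
In outline, the pattern looks something like this; the Widget class and its copy constructor are hypothetical, standing in for whatever copying mechanism the real class ends up providing:

import java.util.HashMap;
import java.util.Map;

public class CopyThenStore {

    // Hypothetical value class with a copy constructor, for illustration only.
    static class Widget {
        String name;
        Widget(String name) { this.name = name; }
        Widget(Widget other) { this.name = other.name; }   // copies the state
    }

    public static void main(String[] args) {
        Map<String, Widget> map = new HashMap<String, Widget>();
        map.put("oldKey", new Widget("original"));

        Widget copy = new Widget(map.get("oldKey")); // copy, don't reuse the reference
        copy.name = "changed";                       // modify the copy only
        map.put("newKey", copy);                     // store under the new key

        System.out.println(map.get("oldKey").name);  // still "original"
        System.out.println(map.get("newKey").name);  // "changed"
    }
}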
 
Ranch Hand
Posts: 489
Well, the class doesn't implement Cloneable, right? You will have to implement cloning through Serialization.
This should be the implementation of your clone() method:
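
A sketch of what such a serialization-based clone() might look like; MyObject stands in for the real class, and the exception handling is only illustrative:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class MyObject implements Serializable {

    // ... the existing fields and methods of the class ...

    // Deep copy: write the object to an in-memory stream and read it back.
    // This works because the class already implements Serializable.
    public Object clone() {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(this);
            oos.close();

            ObjectInputStream ois = new ObjectInputStream(
                    new ByteArrayInputStream(bos.toByteArray()));
            return ois.readObject();
        } catch (Exception e) {
            throw new RuntimeException("Clone via serialization failed", e);
        }
    }
}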



This will do. Now get the object from the map, clone it (which will call the method above), make modifications on the clone and then add it as a new entry in the map.

ram.
 
Aneesha Singh
Ranch Hand
Posts: 47
Many thanks Yacoob, Jeroen and Ram for your help! I have solved the problem by doing what Ram suggested: implementing his solution (using serialization) with a few tweaks to the code he kindly pasted. Works like a charm!! Thanks again!

BTW, there is an article on JavaWorld on this solution's drawbacks and merits that's worth a look - if performance is not a real issue, it's a great way to get around having to write clone() support for every single class in your application.

Cheers!
 
Ranch Hand
Posts: 1970
I'm a bit confused.

I understand that you can make a new copy of a Serializable Object by serialising and deserialising. I would think that you would only want to use this very expensive way of copying if you did not have access to the source code of the Object that you were wanting to copy. But it looks to me as if you are writing a clone() method for your Object that works via serialisation, which implies you can modify the source code.

If you do have access to the source code of the Object that you are wanting to copy, then surely you'd be better to write an ordinary clone() method and declare the Object as Cloneable. I know that people say don't optimise until proven necessary, but copying via serialisation, when you could do an ordinary clone(), is going too far, I think.

Is the explanation that you do have access to the source code of the Object, but not to the source code of its superclass(es) and its superclass(es) are not Cloneable?
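
For comparison, a minimal sketch of the ordinary approach, using a hypothetical Widget class; super.clone() gives only a shallow, field-by-field copy, so any mutable fields would still need copying by hand:

public class Widget implements Cloneable {
    private String name;
    private int count;

    public Widget(String name, int count) {
        this.name = name;
        this.count = count;
    }

    @Override
    public Object clone() {
        try {
            return super.clone();   // shallow, field-by-field copy
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e);   // cannot happen: this class is Cloneable
        }
    }
}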
 
Ranch Hand
Posts: 323
you might not have access to all the source to the object you want to clone. for example, you might have extended a class of someone else's, which doesn't have a clone() method. or, your object might be making extensive uses of other objects as instance variables, and you just don't want to sort out which of those can be clone()d and which can't.
 
Ranch Hand
Posts: 1646
...or the superclass might declare a final clone() method that throws an exception?
 
M Beck
Ranch Hand
Posts: 323
it is my firm opinion that "final", when used for anything other than literal constants, is evil. in fact, if i ever were to write a style guide, i might throw in a paragraph or two to that effect, i think.
 
Wanderer
Posts: 18671
[M]: It is my firm opinion that "final", when used for anything other than literal constants, is evil

Really? Sounds rather extreme to me. I use it very frequently with nonliteral constants, in order to create immutable objects. I can sorta see your point if you're talking about final methods, especially if there's no superinterface you can implement instead. But I'd be interested in hearing your rationale for this statement. Perhaps a separate thread would be in order, so as not to detract from the original topic.
 
Aneesha Singh
Ranch Hand
Posts: 47
Well, Peter... you are right when you say that it's going too far (in terms of the cost of serializing and deserializing instead of writing a clone method), but in my case that doesn't really matter, and this is why:

1. Most of the application objects that I am using have already been written without using Cloneable/clone, so it's a fair bit of work to introduce cloning support and re-test the entire codebase.
2. I am not using the serializing/deserializing for production code, but to automate test data loading, to make testing the application easier and faster and to make it easier to set up tests.

Keeping these two things in mind, it was best to stick with an expensive (in terms of performance) solution that involved minimal work and minimal tampering with existing code. :-D
 
Aneesha Singh
Ranch Hand
Posts: 47
And also, as M. Beck says, since a lot of the code has already been written, and uses a lot of objects everywhere, I don't want to sort out which need to be cloned and which don't... The 'final' clone method is not the issue, though...
 
Peter Chase
Ranch Hand
Posts: 1970
  • Report post to moderator

Originally posted by Aneesha Singh:
2. I am not using the serializing/deserializing for production code, but to automate test data loading, to make testing the application easier and faster and to make it easier to set up tests.



Oh, well if it's for testing only, then whatever works is fine.
 
Peter Chase
Ranch Hand
Posts: 1970

Originally posted by M Beck:
it is my firm opinion that "final", when used for anything other than literal constants, is evil. in fact, if i ever were to write a style guide, i might throw in a paragraph or two to that effect, i think.



I don't agree with that at all.

In the past, performance used to be cited as a reason to use "final". It is true that that reason has largely gone away.

But at least one use of "final" for methods is still very valid, I think. That is where you use "final" to say which parts of a class are suitable for overriding, and which aren't. Sometimes, that just helps to make the code more self-documenting and sometimes it is actually necessary, to prevent maintenance programmers doing something really dumb. For instance, sometimes a particular method might be doing crucial operations on some private data. An override couldn't possibly do the right thing, and making the method "final" enforces this.

If a particular method has some code that is suitable for overriding and some that isn't, then a good thing to do is to make that method "final", but to split out the overridable code into another method, which you do not declare "final". This is self-documenting and prevents inappropriate maintenance.
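
A small sketch of that idea, with hypothetical names:

public abstract class Report {

    // The overall sequence works on private state and must not change,
    // so it is declared final...
    public final void generate() {
        openOutput();
        writeBody();      // ...but this one step is meant to be overridden.
        closeOutput();
    }

    // The overridable part, split out into its own non-final method.
    protected abstract void writeBody();

    private void openOutput()  { /* crucial work on private data */ }
    private void closeOutput() { /* crucial work on private data */ }
}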
 
M Beck
Ranch Hand
Posts: 323

Originally posted by Peter Chase:
But at least one use of "final" for methods is still very valid, I think. That is where you use "final" to say which parts of a class are suitable for overriding, and which aren't. Sometimes, that just helps to make the code more self-documenting and sometimes it is actually necessary, to prevent maintenance programmers doing something really dumb. For instance, sometimes a particular method might be doing crucial operations on some private data. An override couldn't possibly do the right thing, and making the method "final" enforces this.



ugh. i posted that original article late at night, when tired; now i'm writing this defense of it early in the morning, when tired. i need to fix my lifestyle, i think.

anyway. i'm not entirely clear on what Jim Yingst meant by "nonliteral constants", but if it's got something to do with immutable objects, i suspect i'd agree with such a usage of "final". that sounds like something i would have meant to include when i said "literal constants", but i was too worn out to phrase myself better... if you're declaring a thing "final" because it would be a clear error in logic under the design rules of your program to ever redefine that thing, or assign it a new value, then most likely i'd have no problem with that. that's what i use constants for, as do we all, i believe.

but i would not agree with declaring methods or classes "final", at very least not if they're visible outside your project. if someone else can see and use the thing, then that someone ought to be able to override or redefine the said thing, at least at their own risk.

i say that for two reasons. one is to do with communication, the other with design principles.

one: "final" is a compiler directive, a thing used to tell the bytecode compiler what to do - so using it to tell humans what's wise and prudent to do is a bad fit for it. if a method or field really shouldn't be overridden or redefined, by all means say so in the javadoc, or in a README, or both - tell your human users that this-or-that part of your code wasn't designed or meant to be changed, and that if they should try to do it anyway, they're on their own and you won't support them.

"final" can't really help you there, anyway. it's only a compiler directive, and the only thing that ever really obeys it is the bytecode compiler; humans can always outsmart compilers, even when they really oughtn't. if somebody really, really wants to extend your "final" class, they'll find a way, by hook or by crook; all your declaration will truly achieve then is to annoy your users.

two: i feel that making something "public" that isn't fit to be overridden is an information hiding violation. things that could badly break your code in unpredictable or surprising ways if they were ever to be modified should be considered implementation details, and hidden away (as far as possible) out of public view; don't expose them to your users. if you do choose to expose something, you should be able to trust your users to modify it.

(if your users simply aren't trustworthy, you've got bigger problems than "final" can fix, anyway. most users of code are adult enough that they can either be trusted to do the right thing, or to take their bruises when they don't; using a compiler directive to deal with such people seems mildly insulting to me, as if you were assuming all your users were mere compilers.)

three: er. my three reasons are... (shades of Monty Python). anyway, thirdly, i feel that "final" - since it's a compiler declaration - is best fit only to prevent people from doing what would be logic errors if they were doable. but that presumes you know what the logic is. when we're talking about constants - either literal primitives, or objects that should properly be immutable - then, usually, we do know, and we can prove that modifying one of those would lead to such-and-such errors in our implementation; "final" is a good match for preventing such errors.

but when talking about extending classes and overriding methods, we can't be sure of that any longer - because the user who wishes to do that may be laboring under entirely different business restrictions; their logic may not be ours. therefore, they should be allowed to change things as they may need to, since we can't predict and accommodate every potential use and application of our code in every situation ever. to attempt to restrict such usage only makes our own code less useful to others, which is a shame. if our code really isn't fit for such extensions - well, document that in a Javadoc and/or README, and let the users take it from there; if they try to extend the code anyway, they're on their own.

they're adults, they can handle the responsibility. if they're not adults... well, then it's unlikely they'll kill anybody by breaking your computer code, anyway, right?
 
Jim Yingst
Wanderer
Posts: 18671
anyway. i'm not entirely clear on what Jim Yingst meant by "nonliteral constants", but if it's got something to do with immutable objects, i suspect i'd agree with such a usage of "final". that sounds like something i would have meant to include when i said "literal constants"

Yes, for example:
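
Something along these lines, with a hypothetical Foo class and a final field bar:

public class Foo {

    // 'bar' is final, but its value is computed at construction time,
    // not assigned from a literal, so it is not a compile-time constant.
    private final String bar;

    public Foo(String input) {
        this.bar = input.trim().toUpperCase();
    }

    public String getBar() {
        return bar;
    }
}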

Bar is never initialized with a literal, and it's not a compile-time constant. But it's a constant at runtime, for a given instance. That's what I meant. Sounds like we agree on this point, anyway.

Hmmm, I don't think I really agree on the other points, but don't have time to go through them carefully right now. Later...
 