How to resolve java.lang.OutOfMemoryError when Eclipse Memory Analyzer blames "java.lang.String", loaded by "<system class loader>"
I am reading large XML files (around 800 MB) and storing their contents in a database.
It stores many records, then terminates and throws this exception:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.IdentityHashMap.resize(Unknown Source)
    at java.util.IdentityHashMap.put(Unknown Source)
Using Eclipse Memory Analyzer I have created an .hprof file, which says:
76,581 instances of "java.lang.String", loaded by "<system class loader>", occupy 1,043,445,504 (98.76%) bytes. Keywords: java.lang.String
I have setters and getters for retrieving the values. How can I resolve this issue? Any help is appreciated.
I have already tried increasing the memory through the JRE .ini settings, but that did not solve the problem.
EDIT: I am using scireum open to read the XML files.
Example code I have used:

public void readd() throws Exception {
    XMLReader reader = new XMLReader();
    reader.addHandler("node", new NodeHandler() {
        @Override
        public void process(StructuredNode node) {
            try {
                // obj is a field of a custom class with a "name" property (not shown)
                obj.setName(node.queryString("name"));
                save(obj);
            } catch (XPathExpressionException xPathExpressionException) {
                xPathExpressionException.printStackTrace();
            } catch (Exception exception) {
                exception.printStackTrace();
            }
        }
    });
    reader.parse(new FileInputStream("C:/Users/some_file.xml"));
}

public void save(Reader obj) {
    try {
        EntityTransaction entityTransaction = em.getTransaction();
        entityTransaction.begin();
        Entity e1 = new Entity();
        e1.setName(obj.getName());
        em.persist(e1);
        entityTransaction.commit();
    } catch (Exception exception) {
        exception.printStackTrace();
    }
}
Try using a streaming parser for the XML processing.
Processing one big 800 MB XML file with e.g. DOM is not feasible, because it takes far too much memory.
Try using SAX or StAX in Java instead, and process each parsing result right away, without trying to load the complete XML file into memory.
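As a rough illustration, here is a minimal StAX sketch, assuming each record has a <name> child element as in your code; the NameSink callback and the class name are hypothetical placeholders for your own save logic:

import java.io.FileInputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingNameReader {

    // Hypothetical callback: plug in your own persistence code here.
    public interface NameSink {
        void save(String name);
    }

    public static void readNames(String path, NameSink sink) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        try (FileInputStream in = new FileInputStream(path)) {
            XMLStreamReader reader = factory.createXMLStreamReader(in);
            try {
                while (reader.hasNext()) {
                    // Pull events one by one; only the current element is in memory.
                    if (reader.next() == XMLStreamConstants.START_ELEMENT
                            && "name".equals(reader.getLocalName())) {
                        // getElementText() reads the text of <name> and advances
                        // past its end tag, so nothing accumulates.
                        sink.save(reader.getElementText());
                    }
                }
            } finally {
                reader.close();
            }
        }
    }
}

The important part is that each value is handed off immediately and never accumulated, so heap usage stays flat no matter how large the file is. If the scireum open XMLReader you already use streams node by node, the same principle applies there: process and discard each node inside the handler.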
And don't keep the parsing results in memory as a whole: write them to the database as fast as possible, and keep the scope of each parsing result as narrow as possible.
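The IdentityHashMap in your stack trace is likely the JPA persistence context growing with every persisted entity (and its strings), so committing and clearing in batches keeps it small. A minimal sketch, assuming a resource-local EntityManager like your em; the NameRecord entity and the batch size of 500 are made-up placeholders:

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.EntityTransaction;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

public class BatchSaver {

    private static final int BATCH_SIZE = 500; // tune for your setup

    private final EntityManager em;
    private int pending = 0;

    public BatchSaver(EntityManager em) {
        this.em = em;
    }

    // Call once per parsed record, e.g. from the StAX NameSink above.
    public void save(String name) {
        EntityTransaction tx = em.getTransaction();
        if (!tx.isActive()) {
            tx.begin();
        }
        NameRecord record = new NameRecord();
        record.setName(name);
        em.persist(record);

        if (++pending % BATCH_SIZE == 0) {
            tx.commit(); // push the batch to the database
            em.clear();  // detach everything so the parsed strings can be GC'd
        }
    }

    // Call after parsing finishes to commit the last partial batch.
    public void finish() {
        EntityTransaction tx = em.getTransaction();
        if (tx.isActive()) {
            tx.commit();
        }
        em.clear();
    }
}

@Entity
class NameRecord { // hypothetical entity; put it in its own file in a real project
    @Id @GeneratedValue
    Long id;
    String name;
    void setName(String name) { this.name = name; }
}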
Perhaps use intermediate (staging) tables in the database and do part of the processing on the datasets inside the database.
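For example, you could bulk-load the raw values into a staging table with plain JDBC batch inserts and run the heavier transformation afterwards in SQL. A sketch under that assumption; the JDBC URL, credentials, and the staging_names table are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class StagingLoader {

    private static final int BATCH_SIZE = 1000;

    // names would come from the streaming parser above, one at a time.
    public static void load(Iterable<String> names) throws Exception {
        // Placeholder connection details; use your own driver/URL/credentials.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:yourdb://localhost/yourdb", "user", "password")) {
            conn.setAutoCommit(false);
            try (PreparedStatement insert = conn.prepareStatement(
                    "INSERT INTO staging_names (name) VALUES (?)")) {
                int count = 0;
                for (String name : names) {
                    insert.setString(1, name);
                    insert.addBatch();
                    if (++count % BATCH_SIZE == 0) {
                        insert.executeBatch(); // send a full batch to the database
                        conn.commit();
                    }
                }
                insert.executeBatch();         // flush the last partial batch
                conn.commit();
            }
            // Further processing can then run inside the database, e.g.
            // INSERT INTO final_table (...) SELECT ... FROM staging_names ...
        }
    }
}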