Application Crashing with Error - java.net.SocketException: Too many open files

 
Vikram Chelar
Greenhorn
Posts: 2
Dear All,

Over the last few days our production retail application has started crashing with the error 'ERROR [org.apache.tomcat.util.net.JIoEndpoint] (ajp-0.0.0.0-8009-Acceptor-0) Socket accept failed: java.net.SocketException: Too many open files'. We urgently need help resolving this tricky production issue. To give more information, here is an overview of the application:

Background:
The application is web based and is hosted on three separate servers, structured as follows:
Web server running Apache for Internet access (connected to the app server over AJP port 8009)
App server running Apache and JBoss for intranet access (Apache connects to JBoss over HTTP port 8080)
DB server running Oracle 10g for the database

Below is the software environment of all three servers:

09:35:24,985 INFO [AbstractServer] Starting: JBossAS [6.0.0.Final "Neo"]
09:35:26,393 INFO [ServerInfo] Java version: 1.6.0_24,Sun Microsystems Inc.
09:35:26,394 INFO [ServerInfo] Java Runtime: Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
09:35:26,394 INFO [ServerInfo] Java VM: Java HotSpot(TM) 64-Bit Server VM 19.1-b02,Sun Microsystems Inc.
09:35:26,394 INFO [ServerInfo] OS-System: Linux 2.6.18-194.el5,amd64

The application was hosted and available for access from 10-Oct-2011, and everything worked fine with a consistent load of 50 users until 20-Aug-2012. On 21-Aug-2012 the application was no longer accessible over the Internet or intranet, and when we checked the JBoss server log files under /app/jboss/jboss-6.0.0.Final/server/default/log, the following exception was thrown in the five server log files:

2012-08-26 02:54:16,871 ERROR [org.apache.tomcat.util.net.JIoEndpoint] (ajp-0.0.0.0-8009-Acceptor-0) Socket accept failed: java.net.SocketException: Too many open files
at java.net.PlainSocketImpl.socketAccept(Native Method) [:1.6.0_24]
at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:408) [:1.6.0_24]
at java.net.ServerSocket.implAccept(ServerSocket.java:462) [:1.6.0_24]
at java.net.ServerSocket.accept(ServerSocket.java:430) [:1.6.0_24]
at org.apache.tomcat.util.net.DefaultServerSocketFactory.acceptSocket(DefaultServerSocketFactory.java:61) [:6.0.0.Final]
at org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:343) [:6.0.0.Final]
at java.lang.Thread.run(Thread.java:662) [:1.6.0_24]

One of the server log files is attached for your reference. I also checked the open file counts through PuTTY; the output is below:

[tfouser@abc ~]$ /usr/sbin/lsof -p 26844 -l | wc -l
5
[tfouser@abc ~]$ /usr/sbin/lsof -l | wc -l
1455
[tfouser@abc ~]$ cat /proc/sys/fs/file-max
3238931
[tfouser@abc ~]$ cat /proc/sys/fs/file-nr
2550 0 3238931
[tfouser@abc~]$ ulimit -n
1024

We have restarted the JBoss server multiple times and even rebooted the OS, but the problem still exists.

Please help us understand what the issue could be and where. Any insight into this highly critical production issue would be greatly appreciated.

Thanks!

 
Henry Wong
author
Posts: 23951

Vikram Chelar wrote:
[tfouser@abc~]$ ulimit -n
1024

We have restarted the JBoss server multiple times and even rebooted the OS, but the problem still exists.

Please help us understand what the issue could be and where. Any insight into this highly critical production issue would be greatly appreciated.



Well, if this is a highly critical production system, then why is it allowed only 1024 file descriptors? Depending on what the services are doing, that many descriptors can easily be used up by a few hundred users, or even a few dozen.
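For example, every stream, socket, or database connection the application opens and never closes keeps one descriptor busy until the JVM exits, so a handful of users hitting a leaky code path repeatedly can burn through 1024 descriptors quickly. Here is a minimal sketch of the difference (hypothetical class, not taken from your application; on Java 6 the close has to go in a finally block, since there is no try-with-resources):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public class FdLeakSketch {

    // Leaky version: the stream is never closed, so each call keeps one
    // file descriptor open until the JVM exits (or the object happens to
    // be finalized). Repeated calls will eventually hit the 1024 limit.
    static void readLeaky(File f) throws IOException {
        FileInputStream in = new FileInputStream(f);
        in.read(); // ... do some work, but never close the stream
    }

    // Safe version for Java 6: release the descriptor in finally,
    // even if the work inside the try block throws.
    static void readSafely(File f) throws IOException {
        FileInputStream in = new FileInputStream(f);
        try {
            in.read(); // ... do some work
        } finally {
            in.close();
        }
    }

    public static void main(String[] args) throws IOException {
        File f = new File("/etc/hosts"); // any readable file
        for (int i = 0; i < 2000; i++) {
            readLeaky(f); // typically fails with "Too many open files" once ulimit -n is reached
        }
    }
}

The same applies to Sockets and to JDBC Connections, Statements and ResultSets: anything backed by an operating-system descriptor.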

How about increasing it?

Henry
 
Vikram Chelar
Greenhorn
Posts: 2

Henry Wong wrote:

Vikram Chelar wrote:
[tfouser@abc~]$ ulimit -n
1024

We have restarted the JBoss server multiple times and even rebooted the OS, but the problem still exists.

Please help us understand what the issue could be and where. Any insight into this highly critical production issue would be greatly appreciated.



Well, if this is a highly critical production system, then why is it allowed only 1024 file descriptors? Depending on what the services are doing, that many descriptors can easily be used up by a few hundred users, or even a few dozen.

How about increasing it?

Henry



Hi Henry,

Thanks for the reply. Increasing the ulimit was one of my options, but the customer's systems department wants statistics proving that all 1024 file descriptors were actually in use at the time the application crashed.
Please let me know how I can track, through PuTTY, whether all 1024 descriptors are being used.
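Would something along these lines give them those numbers? (A rough sketch only, assuming the Sun/Oracle HotSpot JVM shown in our startup log, where the com.sun.management.UnixOperatingSystemMXBean extension exposes descriptor counts; the class name and the one-minute interval are just placeholders.)

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

// Rough sketch: logs open vs. maximum file descriptors for this JVM process
// once a minute. Relies on the com.sun.management extension of the
// Sun/Oracle HotSpot JVM; on other JVMs the instanceof check simply fails.
public class FdUsageLogger implements Runnable {

    public void run() {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        while (true) {
            if (os instanceof com.sun.management.UnixOperatingSystemMXBean) {
                com.sun.management.UnixOperatingSystemMXBean unixOs =
                        (com.sun.management.UnixOperatingSystemMXBean) os;
                System.out.println(new java.util.Date()
                        + " open FDs=" + unixOs.getOpenFileDescriptorCount()
                        + " max FDs=" + unixOs.getMaxFileDescriptorCount());
            }
            try {
                Thread.sleep(60 * 1000L); // log once a minute
            } catch (InterruptedException e) {
                return; // stop logging when interrupted
            }
        }
    }

    public static void main(String[] args) {
        // Standalone test: start the logger in its own thread.
        new Thread(new FdUsageLogger(), "fd-usage-logger").start();
    }
}

If it were started inside the JBoss VM (for example from a ServletContextListener), it would record the descriptor usage of the JBoss process itself over time, which could then be compared against the 1024 limit at the moment of a crash.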

Thanks!
 