Server connections increase markedly after indexing


  • Server connections increase markedly after indexing

    Hi

    I ran the v5 indexer last week on a large website, and we have since been suffering a major problem with connections to the 'backend' MySQL database server. The website suddenly started requesting over 50 connections at any one time and the server failed - and is still failing.

    The server people, not unreasonably, point at the Zoom indexing as the activity that coincides with the database problems.

    My understanding of the Zoom software is that it makes a single sweep of the accessible website files (this was done online) and stores all the information in a few large files on the web server. CGI code is used to interrogate those files whenever a search is done (I used the CGI option for the search page). No ongoing database connections are needed. Is that a correct assumption?

    Is there any way Zoom could increase the number of database connections needed by a website?

  • #2
    The Zoom Indexer can only make a maximum of 10 connections to the server at any one time - and this is specified by the user on the "General" tab of the Configuration window as the number of download threads to use.

    The default number of threads used is only 2 (which would mean only 2 connections to the server at any one time).

    Note that this is only during the indexing process. No connections to the website are made during the searching process.

    I would suspect that your 50 connections are coming from something else, or from an inherent problem with your site's scripting. You should be able to confirm that Zoom is not the cause by lowering the number of download threads used (even setting it to single-threaded mode) and by checking when these connections are actually made and from which IP addresses (see the sketch below).
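    One way to do that checking, as a minimal sketch (this assumes the mysql-connector-python package and placeholder credentials; it is not part of Zoom), is to ask MySQL for its current process list and look at the host each connection comes from:

    ```python
    # Sketch: list the current MySQL connections and the hosts they come from.
    # Assumes the mysql-connector-python package; host/user/password are placeholders.
    import mysql.connector

    conn = mysql.connector.connect(
        host="localhost",    # placeholder - your database server
        user="admin_user",   # placeholder - an account with the PROCESS privilege
        password="secret",   # placeholder
    )
    cursor = conn.cursor()
    cursor.execute("SHOW FULL PROCESSLIST")  # one row per open connection

    for row in cursor.fetchall():
        # Columns: Id, User, Host, db, Command, Time, State, Info
        conn_id, user, host = row[0], row[1], row[2]
        print(f"connection {conn_id}: user={user} host={host}")

    cursor.close()
    conn.close()
    ```

    If the extra connections appear while the indexer is running and come from the machine running Zoom, that points one way; if they come from your web server's own scripts, that points another.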

    Originally posted by GAtherton
    My understanding of the Zoom software is that it makes a single sweep of the accessible website files (this was done online) and stores all the information in a few large files on the web server. CGI code is used to interrogate those files whenever a search is done (I used the CGI option for the search page). No ongoing database connections are needed. Is that a correct assumption?
    That is correct.

    Is there any way Zoom could increase the number of database connections needed by a website?
    Only during the indexing process, and only by the exact number of threads you allow it to use (up to a maximum of 10).

    Another possibility is that there is a bug or issue in your website's scripting, where some of your scripts/pages make a significant number of database connections (e.g. a badly written or complicated page that connects to the database 5 times for the one request). In that case, 10 simultaneous requests to a page like that would add up to 50 connections (illustrated in the sketch below).
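    To illustrate the kind of scripting issue meant here, the hypothetical sketch below (written in Python with the mysql-connector-python package and placeholder credentials; your site's scripts may well be in PHP or something else entirely) contrasts a page that opens a fresh connection for every query with one that reuses a single connection:

    ```python
    # Hypothetical illustration of the scripting issue described above - not Zoom code.
    # Assumes mysql-connector-python; the credentials and queries are placeholders.
    import mysql.connector

    DB_ARGS = dict(host="localhost", user="web_user", password="secret")  # placeholders

    def render_page_badly():
        """Anti-pattern: every query on the page opens its own connection and
        holds it until the page is finished, so one request ties up 5 connections."""
        open_conns = []
        for _ in range(5):                          # stand-in for 5 separate queries
            conn = mysql.connector.connect(**DB_ARGS)
            cur = conn.cursor()
            cur.execute("SELECT 1")                 # stand-in for a real query
            cur.fetchall()
            open_conns.append((conn, cur))          # nothing is closed until the end
        for conn, cur in open_conns:
            cur.close()
            conn.close()
        # 10 crawler threads hitting this page at once -> 10 x 5 = 50 connections.

    def render_page_better():
        """One connection reused for the whole page: 10 threads -> at most 10 connections."""
        conn = mysql.connector.connect(**DB_ARGS)
        cur = conn.cursor()
        for _ in range(5):
            cur.execute("SELECT 1")
            cur.fetchall()
        cur.close()
        conn.close()
    ```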

    So to summarize, Zoom can only (at its maximum setting) put as much load on your website as 10 users accessing it simultaneously, and only during the indexing process. You can lower the number of threads as mentioned above, so that it acts essentially as a single user accessing your website.

    It would seem to be an issue worth addressing with your server setup, however, if it fails when only 10 users are accessing the site.
    --Ray
    Wrensoft Web Software
    Sydney, Australia
    Zoom Search Engine



    • #3
      That confirms my understanding - thanks!

      The server people have 'fessed up' that they cocked up our setting for the allowed number of concurrent users (50 per hour rather than 50 concurrent!), so that explains why we were offline - Zoom is in the clear.

      Many thanks for your time
