A "spider" is a computer program that automatically collects the content from a web site for a search engine. "Spider" is an old term, the current term is "Bot."

The process of "crawling" a site is to ensure that the most recent content is always available to Google and the other search engines. The more popular the site, the more frequently the bots come to scan the content, so as to ensure the most recent content is always in the search engine's index of web sites.

When the site says someone is "online," it means that a registered user has logged in recently and may still be viewing content on the site.
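As a sketch of how that status is typically computed, the site records a timestamp on each request and treats the user as online while that timestamp is recent. The 15-minute window below is an arbitrary illustrative choice, not a value taken from any particular forum software.

```python
from datetime import datetime, timedelta, timezone

# Arbitrary cutoff chosen for illustration; sites commonly use 5-30 minutes.
ONLINE_WINDOW = timedelta(minutes=15)

def is_online(last_activity: datetime) -> bool:
    """Treat a user as online if their last request fell within the window."""
    return datetime.now(timezone.utc) - last_activity <= ONLINE_WINDOW

# Example: a user whose last request was five minutes ago still shows as online.
last_seen = datetime.now(timezone.utc) - timedelta(minutes=5)
print(is_online(last_seen))  # True
```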