There are two key processes that you need to understand, and you must be careful not to confuse them.
1) Creating an Index
Programs called spiders or crawlers follow links from website to website, cataloguing the details of each site in an index (or database). The search engine constantly updates this index, as millions of new pages are created every day.
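The crawling process above can be sketched in a few lines. This is a minimal illustration, not a real crawler: the pages, their links, and names such as `PAGES`, `crawl` and `site-a.example` are all made up for the example, and the "web" is just an in-memory dictionary rather than real network requests. It shows the core idea of following links from page to page and recording each word in an index.

```python
from collections import deque

# Hypothetical in-memory "web": each page has some text and outgoing links.
PAGES = {
    "site-a.example": {"text": "news about weather", "links": ["site-b.example"]},
    "site-b.example": {"text": "weather forecasts daily", "links": ["site-c.example"]},
    "site-c.example": {"text": "daily news updates", "links": []},
}

def crawl(start):
    """Follow links breadth-first, building an inverted index:
    each word maps to the set of pages that contain it."""
    index = {}
    seen = set()
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in seen or url not in PAGES:
            continue  # skip pages already catalogued or unknown
        seen.add(url)
        for word in PAGES[url]["text"].split():
            index.setdefault(word, set()).add(url)
        queue.extend(PAGES[url]["links"])  # follow this page's links
    return index

index = crawl("site-a.example")
print(sorted(index["weather"]))  # → ['site-a.example', 'site-b.example']
```

When a user later searches, the engine only has to look words up in this ready-made index instead of reading every page again.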
2) Searching the Index
When you enter a search term into a search engine, it quickly looks through the index and returns a list of suitable websites. It uses a ranking algorithm to decide which sites are most suitable. This algorithm considers whether a site is popular (how many previous visits it has had), whether it is reliable (how many other sites link to it), and how relevant its content is (how many times the search term appears, and whether it appears in the URL or the title of the page). Other factors may include when the page was last updated, which country the site is hosted in, and the language of the site.
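The ranking factors listed above can be combined into a simple score. This is only a toy sketch under invented assumptions: real search engines use far more sophisticated (and secret) algorithms, and the pages, weights, and names here (`PAGES`, `INBOUND`, `score`, `search`) are all hypothetical. It shows how relevance (term frequency, term in the URL or title) and reliability (inbound links) might each contribute to a page's position in the results.

```python
# Hypothetical mini-index: each entry mimics a page already crawled.
PAGES = {
    "weather.example": {"title": "weather today", "text": "weather weather forecast"},
    "news.example": {"title": "daily news", "text": "news and some weather"},
}
# Reliability signal: how many other sites link to each page.
INBOUND = {"weather.example": 5, "news.example": 1}

def score(url, term):
    """Toy ranking score: relevance signals plus a reliability signal.
    The weights (+2 bonuses) are arbitrary choices for illustration."""
    page = PAGES[url]
    s = page["text"].split().count(term)  # relevance: term frequency
    if term in url:
        s += 2                            # relevance: term appears in the URL
    if term in page["title"]:
        s += 2                            # relevance: term appears in the title
    s += INBOUND.get(url, 0)              # reliability: inbound link count
    return s

def search(term):
    """Return all pages, most suitable first."""
    return sorted(PAGES, key=lambda url: score(url, term), reverse=True)

print(search("weather"))  # → ['weather.example', 'news.example']
```

Here `weather.example` ranks first because the term appears more often, appears in its URL and title, and more sites link to it.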
Common Mistake: Do not mix up search engines with how the internet works; they are completely separate concepts.