Search engines read pages on the Internet with programs called "robots," "bots," or "spiders" — the three terms mean the same thing. This process is called "crawling." Bots crawl websites night and day, even if they've seen it all before.
This is because webmasters modify pages, delete pages, and add pages. Bots will re-read a website more frequently if they have found that its pages change often.
This might mean every few days or, in some cases, every day or even a few times a day. On the other hand, if the bots have historically found that a website doesn't change much, they might not get around to re-reading it for weeks.
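Search engines don't publish their scheduling logic, so the names and numbers below are illustrative assumptions, not any real crawler's implementation. Still, the adaptive behavior described above can be sketched simply: hash each page to detect whether it changed since the last visit, then shorten the recrawl interval when it did and lengthen it when it didn't.

```python
import hashlib

# Hypothetical bounds: "a few times a day" up to "weeks," per the text.
MIN_INTERVAL_DAYS = 0.25
MAX_INTERVAL_DAYS = 28.0

def fingerprint(page_html: str) -> str:
    """Hash the page content so a later crawl can detect changes."""
    return hashlib.sha256(page_html.encode("utf-8")).hexdigest()

def next_interval(current_interval: float, changed: bool) -> float:
    """Halve the wait if the page changed; double it if it didn't."""
    if changed:
        return max(MIN_INTERVAL_DAYS, current_interval / 2)
    return min(MAX_INTERVAL_DAYS, current_interval * 2)

# Simulate a site that changes for a while, then goes quiet:
interval = 7.0
for changed in [True, True, True, False, False, False]:
    interval = next_interval(interval, changed)
print(interval)  # drifts back toward longer waits once changes stop
```

A real crawler weighs many more signals (links, popularity, sitemaps), but this halving/doubling feedback loop captures the core idea: frequently changing sites earn frequent visits, and static sites get checked rarely.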