Robots.txt is not an appropriate or effective way to block sensitive or confidential material. It only tells well-behaved crawlers that the pages are off-limits; it does not stop your server from delivering those pages to any browser that requests them. For one thing, search engines can still reference the URLs you block (showing just the URL, with no title or snippet) if links to those URLs exist anywhere on the Internet, such as in referrer logs. In addition, non-compliant or rogue crawlers that don't honor the Robots Exclusion Standard may simply ignore the instructions in your robots.txt. Finally, a curious user can read the directories and subdirectories listed in your robots.txt file and guess the URLs of the content you don't want seen.
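To illustrate that last point, consider a minimal robots.txt (the directory names here are hypothetical examples). It politely asks compliant crawlers to skip certain paths, but in doing so it publishes those paths to anyone who looks:

```
# Applies to all crawlers that honor the Robots Exclusion Standard
User-agent: *
Disallow: /private/
Disallow: /internal-reports/
```

Because the file must live at a well-known location (e.g. https://example.com/robots.txt), any visitor can fetch it directly, see `/private/` and `/internal-reports/`, and request those URLs in a browser. Truly sensitive content should be protected with authentication or server-side access controls, not robots.txt.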
At HubSpot, we talk a lot about inbound marketing as a really effective way to attract, engage, and delight customers online. But we still get a lot of questions from people all around the world about digital marketing. So, we decided to answer them. Click the links below to jump to each question, or keep reading to see how digital marketing is carried out today.