Building Internet-facing websites on the SharePoint platform is different from building intranet solutions. Although in both cases you’re using SharePoint as the framework, building public websites introduces some new challenges.
Robots.txt files are a way to tell web robots which parts of your site they should skip while crawling it. Although it’s not an official standard, using robots.txt is a common approach on Internet-facing websites to exclude parts of your site from crawling. Find out how you can easily create and manage the robots.txt file with Mavention Robots.txt.
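As a quick illustration, a minimal robots.txt might look like the sketch below. The paths are hypothetical examples chosen for a SharePoint site, not taken from any actual configuration:

```
# Applies to all crawlers
User-agent: *
# Keep SharePoint system paths out of search indexes (example paths)
Disallow: /_layouts/
Disallow: /Pages/Forms/
# Everything else may be crawled
Allow: /

# Optional: point crawlers at the XML sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` section lists the rules for the crawlers it matches; `*` matches any crawler that doesn’t find a more specific section.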
Part of building an Internet-facing website is notifying your visitors about new content. Using RSS/Atom feeds is a common way to deliver new content to your audience. Find out how we created an RSS feed for our brand new website without a single line of custom code!
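For context, this is roughly what such a feed delivers to subscribers: a minimal RSS 2.0 document. All titles, URLs, and dates below are placeholder values, not content from the actual site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <!-- The feed itself: site title, link, and description -->
    <title>Example Site News</title>
    <link>https://www.example.com/</link>
    <description>Latest articles from the site</description>
    <!-- One item per published article -->
    <item>
      <title>Sample article</title>
      <link>https://www.example.com/news/sample-article</link>
      <pubDate>Mon, 06 Jun 2011 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Feed readers poll this document periodically and surface any new `<item>` entries to the subscriber.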
Building Internet-facing websites on the SharePoint platform requires you not only to understand how SharePoint works but also to know what the Web is about and how things work there. In this part of the How we did it series about our brand new website, I will show you some of the things we implemented to make our website a better Web citizen.
A while ago I released Mavention Meta Fields: a solution that makes it easy to add meta tags to Publishing Pages in SharePoint 2010. Shortly after the release it turned out that its dependency on jQuery was causing problems when integrating the solution with existing environments that use different versions of jQuery.