Tuesday, 28 December 2010

Creating a User-Defined Server Role in SQL Server “Denali”

“Denali” is the code-name for the next release of Microsoft SQL Server, and a community technology preview (CTP) is available for download from here. My colleague Geoff Allix has already posted a couple of articles about the enhancements Denali includes for debugging Transact-SQL scripts here and here, and as the Content Master data platform team continues to investigate the CTP, I’m sure more posts will appear. In this post, I want to discuss a new feature that makes it easier to delegate server-level administrative tasks – user-defined server roles.

If you’re familiar with previous releases of SQL Server, you’ll know that there are essentially two levels of security principal within SQL Server (well, alright, three if you include the operating system) – server-level principals, such as logins, and database-level principals, such as users. Permissions can be granted to these principals in order to allow them to use or manage resources (generally known as securables) at the relevant level. For example, you can grant permissions on server-level securables (such as endpoints and certificates) to server-level principals, and you can grant permissions on database-level securables (such as tables and views) to database-level principals. Obviously, managing permissions for individual principals can become complex (and error-prone) as the number of principals increases, so in common with most software systems, SQL Server supports the idea of grouping principals into roles, enabling you to grant the required permissions to the role and simply add or remove principals from the role in order to allow or disallow them access to the securables.

So far, so ordinary.

Previous releases of SQL Server included a pre-defined set of server-level roles and database-level roles that are already granted commonly required permissions, and to which you can simply add your principals (for example, logins at the server level or users at the database level) in order to quickly enable people to access the resources they need while maintaining the principle of “least privilege” (i.e. not granting any permissions to anyone who doesn’t require them). Additionally, you can create your own user-defined database-level roles but, crucially, until SQL Server “Denali” you could not create your own user-defined server-level roles.

To understand how the ability to create and manage your own server-level roles is useful, let’s consider a scenario where a corporation uses a SQL Server instance to host multiple application databases. Many of these databases are used by internal “home grown” ASP.NET Web applications or client/server applications that use Windows integrated authentication, and to control access to these databases, the DBA has simply created logins in SQL Server for the appropriate Windows Active Directory groups. However, the environment also includes a couple of off-the-shelf applications that do not support Windows integrated authentication, and therefore require their own SQL Server logins. Let’s also suppose that these applications are supported by a team of dedicated application administrators who need to be able to manage the SQL Server logins for the applications, for example to change the passwords periodically.

To accomplish this, I can create a user-defined server role by right-clicking the Server Roles folder in SQL Server Management Studio and clicking New Server Role, as shown below. Alternatively, I can use the new CREATE SERVER ROLE Transact-SQL statement.

[Screenshot: the New Server Role command in SQL Server Management Studio]

Using the SQL Server Management Studio UI reveals the New Server Role dialog box, enabling me to define the server role. In this case, I want to create a role named SQLAccountsAdmin, which will be owned by the built-in sa login. I can also specify the server-level securables I want to assign permissions for, and I can select each securable and set the required permissions. In this case, I’ve selected the AcctsPackage and AppSvcAccount logins (yes, principals can also be securables!) and granted the full set of available permissions on these logins to the SQLAccountsAdmin role.

[Screenshot: the New Server Role dialog box, showing the role name, owner, securables, and permissions]
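
For reference, here’s a minimal Transact-SQL sketch of creating the same role, using the role name and owner from the walkthrough above:

-- Create the server role, owned by the built-in sa login
CREATE SERVER ROLE SQLAccountsAdmin AUTHORIZATION sa;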

To grant permissions to a user-defined server role by using Transact-SQL, you can use the GRANT, DENY, and REVOKE Transact-SQL commands just like you would for any other server-level principal.
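
For example, something like the following grants permissions on the two logins from my scenario to the role. This is just a sketch – rather than listing each permission individually as I did in the dialog box, it grants CONTROL, which (if I recall correctly) covers the more granular login permissions such as ALTER, IMPERSONATE, and VIEW DEFINITION:

-- Grant full control of the two SQL Server logins to the new role
GRANT CONTROL ON LOGIN::AcctsPackage TO SQLAccountsAdmin;
GRANT CONTROL ON LOGIN::AppSvcAccount TO SQLAccountsAdmin;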

Now I need to add some server-level principals to the role, so that they can use their role membership to gain the permissions required to manage the two SQL Server logins. You can do this on the Members tab of the dialog box or by using the ALTER SERVER ROLE Transact-SQL statement.

[Screenshot: adding members to the role on the Members page of the New Server Role dialog box]
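
For completeness, the Transact-SQL equivalent is a single statement per member. The Windows group login below is just a hypothetical example, and it must already exist as a login on the server:

-- Add an existing login (here, a hypothetical Windows group login) to the role
ALTER SERVER ROLE SQLAccountsAdmin ADD MEMBER [CONTOSO\SQLAppAdmins];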

Finally, it’s worth noting that you can also add user-defined server roles as members of other server roles, including the fixed server roles provided out of the box by SQL Server. In general, I’d advise against this, as you can often find yourself granting unnecessary and unintended permissions, but it’s shown here for completeness.

[Screenshot: adding the user-defined server role as a member of a fixed server role]

So, there you have it – user-defined server roles in SQL Server “Denali” provide a flexible way to delegate administrative tasks at the server level.

Friday, 24 December 2010

Installing SharePoint 2010 on Windows 7

I generally do most of my development and “technology exploration” in an environment that reflects the actual production environment as closely as possible – for example, by developing against multiple virtual servers running Windows Server 2008 in a domain configuration. This approach has the advantage of reducing the opportunity for “well, it works on my laptop” style configuration issues when trying to deploy the application into production, but, let’s be honest, it makes life difficult – especially when the “real world” configuration requirements are as onerous as those of SharePoint-based solutions.

Microsoft has documented a way to deploy SharePoint 2010 on a single Windows 7 (or Vista if you prefer) development box, so when I recently needed to do some basic SharePoint development, I decided to ignore my existing virtualized, multi-server SharePoint development and testing environment, and try out Microsoft’s instructions for creating a single-box development environment. For the most part, this went OK, but I did hit a few issues along the way, so I thought it might be useful to document my experience.

First, I installed Windows 7 (64-bit, since SharePoint is 64-bit only!) and then downloaded Microsoft SharePoint Foundation 2010. The download is an executable named SharePointFoundation.exe, which you can simply run if you intend to install on the supported Windows Server platform, but which you need to extract to the file system in order to install on Windows 7 (or Vista). For example, to extract the installation files to a folder named C:\SharePointFiles, I used the following command:

SharePointFoundation.exe /extract:C:\SharePointFiles

Next, I needed to edit the config.xml file provided with the SharePoint files, and add a <Setting> entry to enable installation on a client OS, as shown below:

[Screenshot: the edited config.xml file with the additional <Setting> entry]
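
In case the screenshot is hard to read, the entry described in the Microsoft instructions is the AllowWindowsClientInstall setting, added inside the <Configuration> element (the config.xml file lives under the extracted files at C:\SharePointFiles\files\Setup\config.xml, if I remember the path correctly):

<Configuration>
  <!-- ...existing settings... -->
  <Setting Id="AllowWindowsClientInstall" Value="True"/>
</Configuration>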

The SharePoint installation files include a tool to automatically install and configure SharePoint prerequisites, but this only works on the supported Windows Server OS – you can’t use it on Windows 7, so you need to install and configure the prerequisites manually. The first of these is the Microsoft Filter Pack, and it’s included in the extracted files, as shown here:

[Screenshot: the Microsoft Filter Pack installer in the extracted installation files]

Links to the remaining prerequisites are in the Microsoft documentation, and I simply downloaded and installed the ones I required for SharePoint Foundation on a Windows 7 machine (which included the Sync Framework, the SQL Server 2008 Native Client, and the Windows Identity Foundation).

Next I needed to enable all of the IIS features that SharePoint requires. Microsoft provide the following command, which you can copy to a command prompt window (on a single line) and execute.

start /w pkgmgr /iu:IIS-WebServerRole;IIS-WebServer;IIS-CommonHttpFeatures;
IIS-StaticContent;IIS-DefaultDocument;IIS-DirectoryBrowsing;IIS-HttpErrors;
IIS-ApplicationDevelopment;IIS-ASPNET;IIS-NetFxExtensibility;
IIS-ISAPIExtensions;IIS-ISAPIFilter;IIS-HealthAndDiagnostics;
IIS-HttpLogging;IIS-LoggingLibraries;IIS-RequestMonitor;IIS-HttpTracing;IIS-CustomLogging;IIS-ManagementScriptingTools;
IIS-Security;IIS-BasicAuthentication;IIS-WindowsAuthentication;IIS-DigestAuthentication;
IIS-RequestFiltering;IIS-Performance;IIS-HttpCompressionStatic;IIS-HttpCompressionDynamic;
IIS-WebServerManagementTools;IIS-ManagementConsole;IIS-IIS6ManagementCompatibility;
IIS-Metabase;IIS-WMICompatibility;WAS-WindowsActivationService;WAS-ProcessModel;
WAS-NetFxEnvironment;WAS-ConfigurationAPI;WCF-HTTP-Activation;
WCF-NonHTTP-Activation

This enables the required features, which you can verify in the Windows Features Control Panel applet as shown below:

[Screenshot: the enabled IIS features in the Windows Features Control Panel applet]

Now I was ready to install SharePoint Foundation. I ran Setup.exe and chose the Standalone installation option:

[Screenshot: SharePoint Foundation 2010 Setup with the Standalone installation option]

After the installation was complete, I was prompted to run the SharePoint Products Configuration Wizard, and this is where the wheels fell off! The Standalone installation of SharePoint includes the installation of a SQL Server 2008 Express database server instance (named SHAREPOINT) to host the configuration database, but somewhat annoyingly, you need to apply the Microsoft SQL Server 2008 KB 970315 x64 hotfix before you can run the configuration wizard. However, even after doing this, I still found that the SharePoint Products Configuration Wizard failed to connect to the database server in order to create the configuration database. In desperation, I upgraded the SQL Server 2008 Express instance that had been installed to SQL Server 2008 R2 Express – still no luck.

My investigations resulted in finding a number of useful blog articles, which are listed below – none of these actually solved my specific problem, but they contain some really useful tips!

After some poking around, I discovered a command-line version of the configuration wizard in the C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\BIN folder named psconfig.exe, and by examining its parameter info I discovered a standaloneconfig value for the cmd parameter, as shown below:

[Screenshot: psconfig.exe command-line parameters]
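
From memory, the command that finally got everything configured was essentially this, run from an elevated command prompt in that folder:

psconfig.exe -cmd standaloneconfig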

This seemed to solve my problem, and I now have a fully configured SharePoint Foundation 2010 environment on a Windows 7 virtual machine, as shown below.

[Screenshot: the configured SharePoint Foundation 2010 site running on Windows 7]

All told, it took me the best part of an afternoon to create my “simple” SharePoint development environment – but to be fair, a large percentage of that was spent scrabbling around trying to figure out how to get the configuration wizard to work. Hopefully, your installation will go a little more smoothly!

Happy Holidays!


Monday, 6 December 2010

Another Cloud

Windows Azure and Amazon EC2

Having spent some time working with Windows Azure, I wanted to take a look at some of the other cloud environments out there to get a feel for how they work and how they differ in approach. The first platform I decided to take a look at was the Amazon Elastic Compute Cloud (EC2).

Amazon’s cloud offering is a little different from Microsoft’s: where Windows Azure is a platform and a specially designed framework that allows you to run specially written applications in the cloud, Amazon EC2 allows you to run standard operating systems in virtual machines on Amazon’s servers. So two trade-offs spring immediately to mind:

  1. Windows Azure offers you a single fixed environment as against EC2’s almost completely free choice of operating systems (including Windows). Note that the latest Windows Azure release also includes Virtual Machine Roles in addition to the existing Web and Worker roles, so that you can run your own virtual machines in the cloud.
  2. The Windows Azure platform manages all the scalability issues for you (because of the features built in to the platform), whereas with Amazon EC2 you have to do a lot of the work yourself if you want to build a scalable application that can run across multiple virtual machines. That said, Amazon does offer an auto scaling service that can start up (or shut down) virtual machine instances for you based on demand, and a MapReduce service (for a description of the MapReduce algorithm and how to implement it in Windows Azure, see here) that’s designed to process large amounts of data on demand.

That said, there are a lot of similarities between the two platforms as I’ve outlined in the following table:

Windows Azure | Amazon Web Services | Notes
Content Delivery Network (CDN) | Amazon CloudFront | Both provide high-speed edge caches for static data, used for example to host video or other media for your cloud application.
Windows Azure Table Service | Amazon SimpleDB | Schema-less table storage.
SQL Azure | Amazon Relational Database Service (RDS) | SQL Azure is SQL Server in the cloud; Amazon RDS is MySQL in the cloud.
AppFabric Service Bus | Amazon Simple Queue Service and Amazon Simple Notification Service | Hosted queue services enabling computers to exchange data through a cloud-hosted message hub.
Windows Azure Connect | Amazon Virtual Private Cloud | Creating virtual private networks that connect on-premises computers with your cloud instances.
Windows Azure Blob Storage | Amazon Simple Storage Service (S3) | Facility to allow you to store arbitrary data in the cloud.
Windows Azure Drive | Amazon Elastic Block Store | Storage that can be formatted and used like hard drives by your cloud application.

 

Daily News

Earlier this year I purchased a Kindle ebook reader, which has been fantastic as a way to carry around books and reference material. However, one area that I was slightly disappointed with was the selection of subscriptions to newspapers and journals available on the Amazon site. I soon found that I could generate my own news digests from just about any source by using an open source tool called Calibre. Once I had customized the news feeds that I wanted to read on my Kindle, I could use Calibre’s command line interface to generate the file containing my news and email it direct to the Kindle. This is all great, except that the machine that generates the Kindle news feed using Calibre needs to be running. Most of the time it is, but if on occasion I’m away from home without my laptop, it would still be great to get my daily fix of news delivered to my Kindle.

Calibre is very smart in the way that it generates ebooks containing news, provided you don’t mind doing a bit of Python scripting, so I wanted to carry on using it. Running Calibre on an Amazon EC2 virtual machine seemed like a good way to automate sending out daily news from an always-on machine, and it gave me a reason to investigate how easy this would be to achieve with Amazon EC2.

Setting up my Cloud Machine with Amazon EC2

After signing up for EC2, the first decision was which operating system to use. Amazon currently has an AWS Free Usage Tier offer, which is only free if you use a Linux operating system, so Linux it was. However, I was then faced with a choice of several hundred different base virtual machine images of various flavours of Linux. Ubuntu seemed to be the most popular, and a bit of googling soon revealed which were the “official” Ubuntu machine images.

[Screenshot: selecting an official Ubuntu AMI]

Running my instance of Ubuntu on Amazon’s servers was as simple as selecting the base machine image and clicking the launch button in the web console (choosing a micro instance so that I stayed within the free usage tier). The Public DNS value is the machine’s DNS name.

[Screenshot: the running instance and its Public DNS value in the AWS Management Console]

The next step was to connect to my virtual machine, which involved some security configuration. First of all, I needed a key, which was generated for me when I launched the virtual machine. Secondly, I needed to open up the virtual machine’s firewall to allow me administrative access, so I added an entry on the Security Group page to enable SSH.

[Screenshot: the Security Group settings with SSH access enabled]

My only stumbling block came when I tried to connect to the virtual machine using Putty as an SSH client on Windows: Putty didn’t recognize the key that EC2 had generated for me when I launched the virtual machine. It turned out that I needed to convert the key to a different format by using Puttygen. With that sorted out, I could run a command shell on the virtual machine, and copy files to and from it using PSCP.
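
For example, once the key has been converted to .ppk format, copying a file up to the instance with PSCP looks something like this (the key file name, local file, and host name here are placeholders; the official Ubuntu images log you in as the ubuntu user):

pscp -i mykey.ppk report.txt ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/home/ubuntu/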

Installing Calibre on Ubuntu turned out to be a single command:

sudo apt-get install calibre

Finally, I could set up a scheduled command using crontab to generate and email my Kindle news feed every day at 6am.
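
For the record, the crontab entry looks something like the sketch below. The recipe path, output file, SMTP relay details, and Kindle email address are all placeholders rather than my real settings, and the exact calibre-smtp switches may vary slightly between Calibre versions:

# At 6am every day, build the news ebook from a Calibre recipe and email it to the Kindle
# (assumes Calibre's command-line tools are on the PATH; all names and credentials are placeholders)
0 6 * * * ebook-convert /home/ubuntu/recipes/daily-news.recipe /home/ubuntu/daily-news.mobi && calibre-smtp -a /home/ubuntu/daily-news.mobi -r smtp.example.com --port 587 -u smtpuser -p smtppassword -s "Daily news" me@example.com myname@free.kindle.com "Daily news attached"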

Conclusions

To summarize what I learnt from my first use of Amazon EC2:

  • Setting up a virtual machine in the cloud is very straightforward with the Amazon Web Services web-based management console. It also looked as if it would be quite simple using the command-line tools.
  • Choosing a suitable base operating system is more difficult. Because someone else has installed the OS and a selection of software before you start, you really need to know your way around the OS to be sure that it’s secure and properly configured. In fact, you probably want to install it yourself, which is possible, but a bit more complicated. Also, it’s down to you to make sure everything is kept up to date with patches and so on.
  • Given the choice of operating systems available, you can run just about any piece of software you like (even applications with GUIs if you use technologies like Remote Desktop or VNC). However, there’s no guarantee that it will scale — in order for an application to scale it must be able to run in multiple virtual machines simultaneously, and probably be designed to use one or more of the scalable storage services like Amazon SimpleDB or Amazon Simple Storage Service.

Friday, 3 December 2010

Where Next with the Cloud?

I’ve spent most of this year embedded with a team run by Eugenio Pace in the patterns & practices group at Microsoft working on three books (with more to come). The first two have already been published — see the links on the right. You can also view the content on MSDN.

The first book includes an introduction to the Windows Azure Platform, and then describes how the fictional Adatum company migrates its existing ASP.NET expense reporting application to the cloud. The book looks at the mechanics of how Adatum performs the migration as well as examining the significant design decisions made by Adatum, the trade-offs it had to consider, and the cost implications. For example, the original, on-premises application used SQL Server as its data store. Adatum had to decide whether to go with SQL Azure for the cloud-based version of the application, which would be simple to implement, or expend more development effort to port the storage functionality in the application to Windows Azure table and blob storage.

The second book describes a “green field” scenario where the fictional Tailspin company is developing an online surveys application. With a new application, Tailspin is not constrained by any existing design decisions or implementation choices, but can choose which features of the Windows Azure platform to use. The design decisions addressed in the book include how to make the Surveys application a multi-tenant cloud application, and how to make the application scale on demand (for example, to handle a customer creating a survey that they expect to get a million responses to in the week before Christmas). The Tailspin Surveys application will make a reappearance in a forthcoming book on Windows Phone 7 development, where a Windows Phone 7 device will become a client application enabling users to complete surveys on their phone.

Both books also have companion, downloadable code that you can use to explore exactly how these two companies chose to implement their applications for the Windows Azure platform, and hands-on labs that will guide you through some of the specific areas of the implementations.

Next year, there will be a third book on Windows Azure that will cover some of the Windows Azure platform functionality not used by Adatum and Tailspin, for example the Access Control Service, and bring things up to date with some of the new features appearing in the Windows Azure platform.

Thursday, 2 December 2010

Serious Guidance for Serious SharePoint Developers



Earlier this year, I was lucky enough to spend several months working with the patterns & practices team at Microsoft, producing guidance for developers and architects on working with SharePoint 2010—check out the Developing Applications for SharePoint 2010 pages on MSDN. The guidance includes:


  • Documentation that provides deep technical insights into core aspects of SharePoint 2010 development
  • Deployable reference implementations with a realistic level of complexity
  • "How To" documentation on tricky tasks
  • A library of utility classes that you can use in your own SharePoint applications

The guidance is primarily aimed at experienced SharePoint developers and architects who want a more sophisticated level of guidance. Rather than explaining the basics of SharePoint development, it aims to give you the information and resources you need to make effective architectural and implementation decisions. For example, the topics on sandboxed solutions provide a detailed insight into how sandboxed assemblies are loaded and executed, how resource monitoring and throttling criteria are applied, and how low-privileged process accounts together with CAS policies restrict the functionality of sandboxed solutions. In light of these insights, it goes on to explore exactly what you can and can't do with sandboxed solutions. Patterns & practices go to great lengths to make sure their guidance has real world relevance—the core team included two SharePoint MVPs, Todd Baginski and Rob Bogue, and every chapter and component was reviewed by an external "advisory council" consisting of leading industry figures in the world of SharePoint.


We've now distilled the core content from this guidance into a new book, Designing Solutions for SharePoint 2010. If you're new to SharePoint development, this probably isn't the book for you. However, if you already know your SPSite and your SPWeb and you're looking for deeper technical insights, then this could be a valuable addition to your bookshelf.