I’ll discuss continuously delivering web and mobile applications in a later post, but since I recently worked on a BI project I’d like to start with guidance for using powerdelivery to deliver a Business Intelligence (BI) solution. Such a solution might deliver databases, cubes, packages, reports, or SharePoint dashboards such as PerformancePoint.

I’ll assume at this point you have a backlog of user stories that describe the business capabilities that are desired. You’ve also had your sprint planning meeting and come up with acceptance criteria for what you’re going to build in your first sprint. You want to iteratively release pieces of acceptance-tested BI functionality and to reduce your cycle time: the time it takes to go from an idea for BI functionality to its delivery to users.

In a nutshell, you want to release to users as quickly as possible after an idea is conceived!

Perform the following steps initially by deploying only to your “Commit” (development) build environment. Once you have it working there, immediately promote it to test, and then to production, to get your initial release in place even if it has no functionality yet!

Step 1: Create build account

Have your IT administrator create a user account for builds to run under. Configure Team Foundation Server to build using this account (see the TFS agent service on your TFS agent computers).

Step 2: Procure and secure environment

Identify the environments that will be used to deliver your solution. Determine the desired production environment’s physical and virtual nodes (servers or VMs), and procure a development and test environment with at least one node for each scaled tier of the production environment. For example, if you have a load balancer with 5 nodes in production, have a load balancer with just one node in development and test. Ideally, you should set up these computers from scratch with identical OS and middleware (SQL Server, for example) and lock them down so they can only be accessed by the Active Directory build account. All future configuration changes will be made via your build.

Step 3: Open ports

Have your IT administrator open ports for the TFS agent to perform automation activities. Ports 5985 and 5986 must be open on each computer to be automated (the development, test, and production database and SharePoint server computers or VMs, for example). You also need to open any ports needed to perform other automation activities and send data between computers. Examples are port 1433 for SQL Server, or ports 2383 and 2384 for SSAS (the multidimensional and tabular defaults, respectively). You may also need to open port 137 to allow UNC paths to be accessed.
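
If your administrator prefers to script this rather than use the firewall UI, a minimal sketch using the built-in netsh command (run from an elevated PowerShell prompt on each target computer) might look like the following. The rule names are only illustrative, and the ports assume default WinRM, SQL Server, and SSAS configurations:

# Allow WinRM over HTTP and HTTPS so the TFS agent can automate this computer
netsh advfirewall firewall add rule name=WinRM-HTTP dir=in action=allow protocol=TCP localport=5985
netsh advfirewall firewall add rule name=WinRM-HTTPS dir=in action=allow protocol=TCP localport=5986

# Allow SQL Server and SSAS multidimensional (adjust the ports to match your instances)
netsh advfirewall firewall add rule name=SQLServer dir=in action=allow protocol=TCP localport=1433
netsh advfirewall firewall add rule name=SSAS dir=in action=allow protocol=TCP localport=2383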

Step 4: Create delivery pipeline

Next, use the AddPipeline utility to create builds for your deliverable. You want to create a build for each capability you need to independently deploy. If you have databases, SSIS packages, and cubes that all work together to provide one data mart, web site, or BI capability, you may want to keep these in a single build for simplicity of maintenance.

Step 5: Create UNC shares on environment nodes

Create a UNC share on each of your computers that will have SSIS packages, SSAS models, databases, or other assets deployed to them as files. Give the Active Directory account your build runs under write permission to these shares.
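
As a sketch, you could create such a share from an elevated PowerShell prompt on each node with the built-in net share and icacls commands; the share name, folder, and DOMAIN\TfsBuild account below are placeholders for your own values:

# Create the folder that will back the share (if it does not already exist)
New-Item -ItemType Directory -Force -Path C:\Deploy | Out-Null

# Create a deployment share and give the build account permission to write to it
net share Deploy=C:\Deploy "/GRANT:DOMAIN\TfsBuild,CHANGE"

# The build account also needs NTFS permission on the underlying folder
icacls C:\Deploy /grant "DOMAIN\TfsBuild:(OI)(CI)M"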

Step 6: Record your environment configuration

Edit the .csv files for your powerdelivery build’s environment configuration and add name/value pairs for each setting that is environment specific. Include settings for things such as:

  • Database servers
  • Database names
  • Computer names
  • UNC paths
  • Tabular server instances
  • Connection strings

Tip: If you have multiple nodes in production and only one in development and test but need to automate configuration, use a PowerShell array as the value in the .csv file and loop over it in your script to configure each node. For example, in production:

DatabaseServerNames,@('Server1','Server2','Server3')

In development:

DatabaseServerNames,@('DevServer')
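
Your script can then loop over whichever list the current environment supplies. This is only a sketch; it assumes the value from the .csv ends up in a global variable named $databaseServerNames (your variable names from step 8 may differ):

# Runs once per node; in development and test this loop sees a single server
foreach ($serverName in $databaseServerNames)
{
    Write-Host "Configuring database server $serverName"
    # ... per-node deployment or configuration steps go here ...
}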

Step 7: Identify subsets of data for automated testing

As you identify source data over sprints, create views (if they don’t already exist) to abstract the data, and use a database migration technology such as RoundhousE or Visual Studio Database Projects to deploy them with your build. If an existing system that one of your components sources data from has millions of rows, you will want development versions of these views that pull only a subset of the data suitable for testing. The views in your UAT and production environments will pull the entire data set.

The reason for this is to speed up builds and processing in your commit (development) environment so that automated tests can run and provide feedback faster than they could if the entire data set were processed. You will need to lean heavily on data experts and business analysts either to identify an existing subset of the data, or to script the creation of new data (perhaps in an empty database) suitable for running your tests.
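
As a rough sketch of the idea, the commit environment might get a version of a view that pulls only a sample of rows, while test and production get the full query. The view, table, and variable names here are made up, and Invoke-Sqlcmd (from the SQL Server PowerShell tools) is just one way to run the script from your build:

# Hypothetical commit-only version of a source view that limits the row count
$createDevView = @"
CREATE VIEW dbo.SourceOrders AS
SELECT TOP 10000 OrderId, CustomerId, OrderDate, Amount
FROM SourceSystem.dbo.Orders;
"@

Invoke-Sqlcmd -ServerInstance $databaseServer -Database $databaseName -Query $createDevView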

Step 8: Load your environment configuration

Modify your build script so that the Init function of the build creates global variables for all of the environment configuration settings you filled out in step 6.
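
How the settings reach your script depends on your powerdelivery version, so treat this as a minimal sketch; it assumes Init receives the current environment’s settings as a hashtable whose keys match the names in your .csv files (the variable and setting names are placeholders):

function Init($settings)
{
    # Expose environment-specific settings to the rest of the build as globals
    $global:databaseServer  = $settings['DatabaseServer']
    $global:databaseName    = $settings['DatabaseName']
    $global:uncDeployPath   = $settings['UncDeployPath']
    $global:tabularInstance = $settings['TabularServerInstance']
}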

Step 9: Copy assets to the drop location

Modify your build script further by editing the Compile function. Use this function to copy any files you will need during deployment from your current directory (where TFS grabbed a copy of the source code) to the drop location. These files include DLLs, SSIS packages, .asdatabase files, .sql scripts, PowerShell scripts, or any other files that will be deployed to the environment computers.
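
A sketch of what that might look like; $dropLocation stands in for however your build exposes the drop folder path, and the source folder names are placeholders:

function Compile
{
    # Create matching folders in the drop location, then copy everything the
    # Deploy function will need (it cannot reach source control later)
    New-Item -ItemType Directory -Force -Path "$dropLocation\Database", "$dropLocation\SSIS", "$dropLocation\SSAS" | Out-Null

    Copy-Item ".\Database\*.sql"    "$dropLocation\Database" -Force
    Copy-Item ".\SSIS\*.dtsx"       "$dropLocation\SSIS" -Force
    Copy-Item ".\SSAS\*.asdatabase" "$dropLocation\SSAS" -Force
}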

Step 10: Script environment modifications

If the latest version of your code depends on a configuration change to the environment, use this function to make that change; a sketch of one example follows the list below.

Examples of things to do here are:

  • Set environment variables. Your SSIS packages or other deliverables will pick up the correct settings for each environment.
  • Use the Azure PowerShell cmdlets to procure an Azure node or database
  • Give security permission to users on SQL databases
  • Run NuGet to install .NET dependencies (log4net, Entity Framework etc.)
  • Apply OS or middleware patches
  • Change security settings of services or file system
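
As a sketch of the first item, you could set a machine-level environment variable on each node over PowerShell remoting; the variable name and the $databaseServerNames and $connectionString values are placeholders drawn from your step 6 configuration:

# Point SSIS packages at the right connection for this environment by setting
# a machine-level environment variable on every database node
foreach ($serverName in $databaseServerNames)
{
    Invoke-Command -ComputerName $serverName -ScriptBlock {
        param($value)
        [Environment]::SetEnvironmentVariable('BI_CONNECTION_STRING', $value, 'Machine')
    } -ArgumentList $connectionString
}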

Step 11: Script deployment activities

In the Deploy function of your script, you will perform the steps that actually deploy the solution assets. You can only use files in the build’s drop location, so a common mistake here is trying to run something that is in source control but was never copied in the Compile function. If this happens, go back to step 9.

Examples of things to do here are:

  • Run SQL scripts
  • Deploy and/or run SSIS packages
  • Create and/or run SQL jobs
  • Deploy and/or process SSAS cubes (BISM Multidimensional or Tabular)
  • Start/stop SQL jobs

Tip: Your script runs on your TFS build agent computer, but many of the deployment utilities used for automation, such as dtexec.exe (to run SSIS packages) or Microsoft.AnalysisServices.Deployment.exe (to deploy cubes), run on the database or other servers. Because of this, you will often need to copy files that are input to these commands to one of the UNC shares you set up in step 5.
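
For example, a sketch of deploying and running one SSIS package might copy it from the drop location to the share from step 5 and then invoke dtexec on the database server over remoting. The share, package, and path names are placeholders, and your dtexec location depends on your SQL Server version:

# Copy the package to the UNC share on the database server
Copy-Item "$dropLocation\SSIS\LoadDataMart.dtsx" "\\$databaseServer\Deploy" -Force

# Run the package on the database server itself, where dtexec.exe is installed
Invoke-Command -ComputerName $databaseServer -ScriptBlock {
    & "C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /F "C:\Deploy\LoadDataMart.dtsx"
    if ($LASTEXITCODE -ne 0) { throw "SSIS package failed with exit code $LASTEXITCODE" }
}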

Step 12: Script unit tests

In the TestUnits function of your script, you can run tests that make sure that individual units of your solution work independently. You might use MSTest.exe to run Visual Studio database unit tests written in T-SQL.
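
A sketch of that call; the test assembly name is a placeholder, and the MSTest.exe path depends on your Visual Studio version:

# Run the database unit tests that were compiled and copied to the drop location
& "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\MSTest.exe" `
    /testcontainer:"$dropLocation\Tests\DataMart.UnitTests.dll" `
    /resultsfile:"$dropLocation\Tests\UnitTestResults.trx"

if ($LASTEXITCODE -ne 0) { throw "One or more unit tests failed" }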

Step 13: Script acceptance tests

If you automate only one kind of test, at least automate the acceptance tests. These are what you told the business during the sprint planning meeting that you’d demonstrate. Read my prior post on defining acceptance if you are new to it.

Because most BI solutions use scheduled batch jobs to do processing, you will need to trigger those jobs (or any other processing) from the Deploy function in step 11 if you want their results available to be tested here.

Since powerdelivery is a synchronous script, you have two options for doing acceptance testing:

Inline publishing of tests

With this approach, your script runs the SSIS packages or SQL scripts that a job normally invokes on a schedule and waits for them to finish. You can then run your tests immediately after and automate publishing their results if you want business users who access the TFS portal to see the results of your testing.
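
A sketch of the publishing part, using MSTest.exe to both run the acceptance tests and publish the results against this build in TFS; the server URL, team project, build name variable, platform, and flavor are all placeholders for your own values:

# Run the acceptance tests and publish the results so they appear on the
# TFS portal against the build they verified
& "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\MSTest.exe" `
    /testcontainer:"$dropLocation\Tests\DataMart.AcceptanceTests.dll" `
    /publish:"http://tfsserver:8080/tfs/DefaultCollection" `
    /publishbuild:"$buildName" `
    /teamproject:"BIProject" `
    /platform:"Any CPU" `
    /flavor:"Release"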

Late publishing of tests

With this approach, you let the build finish and then wait for the scheduled SQL jobs or other processing to complete by monitoring them. When they complete, run your tests manually from Visual Studio and publish the results to the build they were run against.

Step 14: Automate restoring test databases from production

To make sure that any database alterations made in production are tested exactly as they will behave in production, add a step to the Deploy function that takes a production backup and restores it over your test databases prior to running those alterations. Run this step only during a test environment build; you can check for this using the powerdelivery Get-BuildEnvironment helper cmdlet.
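
As a sketch, assuming Get-BuildEnvironment returns the environment name and the build account can reach a shared backup folder (the server variables, database name, and backup path are placeholders):

if ((Get-BuildEnvironment) -eq 'Test')
{
    # Back up production and restore it over the test database before any
    # schema alterations run, so they are verified against real data
    Invoke-Sqlcmd -ServerInstance $productionDatabaseServer -Query "
        BACKUP DATABASE [DataMart]
        TO DISK = N'\\backupserver\Backups\DataMart.bak' WITH INIT;"

    Invoke-Sqlcmd -ServerInstance $databaseServer -Query "
        RESTORE DATABASE [DataMart]
        FROM DISK = N'\\backupserver\Backups\DataMart.bak' WITH REPLACE;"
}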

Promoting your releases

At this point, you should get your build to succeed in the Commit (development) environment and then immediately promote it to Test, and then to Production. As you plan sprints, use the knowledge you gained setting up the script to help team members estimate the work it will take to modify the build to deploy their features and make any environment changes they need. Include this work in your estimates for the user stories you deliver in the future.

When one or more team members have their features passing acceptance tests in the Commit environment, promote that build to your Test environment and have it inspected by QA. If it is of sufficient quality, hold the sprint review meeting. If the business decides to release to production, queue a Production build and release it to your users.

