
Configuring BIDS 2008 / 2008 R2 and Visual Studio 2008 to work with Database & Business Intelligence Projects and Team Foundation Server (TFS)

Working with the SSIS, SSAS, and SSRS Business Intelligence projects alongside Database projects in the same Visual Studio 2008 development environment can be challenging, as there is no clear documentation on how to enable both sets of projects. Database projects are supported by the Visual Studio 2008 Developer, Professional and Team System Database Editions. The SSIS, SSAS, and SSRS Business Intelligence projects are supported by the Visual Studio 2008 shell that comes with SQL Server 2008 / 2008 R2.

If you have BIDS 2008 / 2008 R2 installed by itself and you try to open a database project, you get an error saying that this type of project is not supported. You get the same error if you have Visual Studio 2008 installed by itself and try to open one of the Business Intelligence projects. The reason behind these errors is that BIDS 2008 / 2008 R2 does not come with the database project template, and Visual Studio 2008 does not come with the SSIS, SSAS, and SSRS Business Intelligence project templates.

So how do you get both sets of templates in a single development environment?

To get both sets of templates in the same development environment, you need to install Visual Studio 2008 and BIDS in the right order, as described in the following steps:

  1. Uninstall all of these if you already have them installed in your environment:
    – Visual Studio 2008
    – BIDS 2008 / 2008 R2
    – Team Explorer 2008
  2. Install Visual Studio 2008
  3. Install Visual Studio 2008 Service Pack 1
    http://www.microsoft.com/en-us/download/details.aspx?id=10986
  4. Install BIDS from your SQL Server 2008 / 2008 R2 installation media

At this point you should have a working Visual Studio environment with both sets of projects. A quick way to verify that the installation was successful is to take a look at the Visual Studio version on the Help|About page. It will show as Version 9.0.30729.4462 QFE, as shown in the picture below. QFE stands for Quick Fix Engineering.

VS2008QFE

When you launch Visual Studio 2008 or BIDS 2008 / 2008 R2 you will be prompted to select the default environment settings. Since I work with the Business Intelligence projects most of the time, I select the Business Intelligence environment. This is an option presented only the first time you open Visual Studio. To change this setting, use the Import and Export Settings wizard, which is available on the Tools menu. For more information on choosing and changing the environment settings go to http://msdn.microsoft.com/en-us/library/6k364a7k(v=vs.90).aspx.

Connecting to TFS 2005 and 2008

To connect to TFS 2005 and 2008 you will need to download and install Team Explorer 2008 from http://www.microsoft.com/en-us/download/details.aspx?id=16338.

If your end goal is to work with database projects and business intelligence projects and connect to TFS 2005 or TFS 2008, the complete steps are as follows:

  1. Uninstall all of these if you already have them installed in your environment:
    – Visual Studio 2008
    – BIDS 2008 / 2008 R2
    – Team Explorer 2008
  2. Install Visual Studio 2008
  3. Install Visual Studio 2008 Service Pack 1
    http://www.microsoft.com/en-us/download/details.aspx?id=10986
  4. Install BIDS from your SQL Server 2008 / 2008 R2 installation media
  5. Install Team Explorer 2008
    http://www.microsoft.com/en-us/download/details.aspx?id=16338

Connecting to TFS 2010

In some cases, the solutions and projects are 2008 / 2008 R2 projects but the repository is TFS 2010 or TFS 2012. Even if you installed Team Explorer 2008, you will get an error when trying to connect to a TFS 2010 or TFS 2012 server. This is because Team Explorer 2008 does not support full URL paths (e.g. https://myservername/mytfs/mycollection) in the TFS server name section.

To fix this issue you will need to download and install the Forward Compatibility Update for Team Explorer 2008 SP1 for Team Foundation Server 2010 (http://www.microsoft.com/en-us/download/details.aspx?id=10834).

Notice that this update can only be applied to Team Explorer 2008 SP1 (Service Pack 1). The curve ball here is that there is no Team Explorer 2008 SP1 available as a download. The way to turn Team Explorer 2008 into Team Explorer 2008 SP1 is to apply Visual Studio 2008 SP1 to it. This means that if you followed steps 1 to 4 above, you will need to repeat Step 3 (re-install Visual Studio 2008 SP1).

If your end goal is to work with database projects and business intelligence projects and connect to TFS 2010, the complete steps are as follows:

  1. Uninstall all of these if you already have them installed in your environment:
    – Visual Studio 2008
    – BIDS 2008 / 2008 R2
    – Team Explorer 2008
  2. Install Visual Studio 2008
  3. Install Visual Studio 2008 Service Pack 1
    http://www.microsoft.com/en-us/download/details.aspx?id=10986
  4. Install BIDS from your SQL Server 2008 / 2008 R2 installation media
  5. Install Team Explorer 2008
    http://www.microsoft.com/en-us/download/details.aspx?id=16338
  6. Re-install Visual Studio 2008 SP1
  7. Install the forward compatibility update for Team Explorer 2008 SP1 for Team Foundation Server 2010
    http://www.microsoft.com/en-us/download/details.aspx?id=10834

Connecting to TFS 2012

To connect to TFS 2012 the following update is required:

Visual Studio 2008 SP1 Compatibility GDR for Visual Studio 2012 Team Foundation Server and Team Foundation Service Preview (http://www.microsoft.com/en-us/download/details.aspx?id=29983).

If your end goal is to work with database projects and business intelligence projects and connect to TFS 2012, the complete steps are as follows:

  1. Uninstall all of these if you already have them installed in your environment:
    – Visual Studio 2008
    – BIDS 2008 / 2008 R2
    – Team Explorer 2008
  2. Install Visual Studio 2008
  3. Install Visual Studio 2008 Service Pack 1
    http://www.microsoft.com/en-us/download/details.aspx?id=10986
  4. Install BIDS from your SQL Server 2008 / 2008 R2 installation media
  5. Install Team Explorer 2008
    http://www.microsoft.com/en-us/download/details.aspx?id=16338
  6. Re-install Visual Studio 2008 SP1
  7. Install Visual Studio 2008 SP1 Compatibility GDR for Visual Studio 2012 Team Foundation Server and Team Foundation Service Preview
    http://www.microsoft.com/en-us/download/details.aspx?id=29983

Hopefully this post helps you get your environment all squared away.


SQL Server Upgrade error: It is not possible to change the SQL Server features to be upgraded in this release. Validation errors. There are no features selected for upgrade.

On a recent project I was asked to “upgrade” a licensed SQL Server 2008R2 Standard Edition to SQL Server 2008R2 Enterprise Edition. I have done this multiple times with no issues. The “kicker” was that the upgrade required a 48-hour turnaround time but no installation media for SQL Server Enterprise was going to be available in that same timeframe due to a delay in the procurement process.

In order to meet project requirements and gain some time, a team member suggested we go ahead and do the “upgrade” using a downloaded copy of SQL Server 2008R2 Evaluation Edition since Evaluation Edition contains all the Enterprise features that were required. Once the installation media for SQL Server 2008R2 Enterprise arrived, we were to simply upgrade the Evaluation Edition to Enterprise Edition.

My immediate response was that going from a licensed edition to an evaluation edition was not possible. It is not a "natural" upgrade path and is actually considered a "downgrade," as you are going from a licensed edition to an unlicensed edition. As a professional, I had to back this statement up with official Microsoft documentation. As a geek, no matter what the documentation said, I had to try it myself to see what happened. So I did both.

Documentation
A quick search of Books Online for SQL Server 2008R2 led me to the MSDN article titled "Version and Edition Upgrades" (http://msdn.microsoft.com/en-us/library/ms143393(v=sql.105).aspx). This article is great as it outlines all the supported upgrade scenarios for all versions and editions of SQL Server. A quick scroll towards the bottom shows that the only two upgrade paths supported for Standard Edition are Enterprise and Datacenter. As a matter of fact, this article shows that there is no supported path from any other edition to Evaluation Edition.

Additionally, there are a couple of footnotes at the bottom. An important one to take a look at is footnote No. 2. Although the statements in footnote No. 2 are made in the context of failover clusters, some of them also apply to stand-alone installations. This footnote clearly states that a SQL Server 2008R2 Standard to SQL Server 2008R2 Evaluation upgrade is not supported.
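
Before attempting any edition change, it helps to confirm exactly which edition and build you are starting from. A minimal T-SQL check using standard SERVERPROPERTY calls (nothing here is specific to this scenario):

SELECT SERVERPROPERTY('Edition')        AS Edition,        -- e.g. Standard Edition (64-bit)
       SERVERPROPERTY('ProductVersion') AS ProductVersion, -- build number
       SERVERPROPERTY('ProductLevel')   AS ProductLevel;   -- RTM, SP1, SP2, ...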

Test

I wanted to see for myself the error message generated if I tried to carry out the upgrade anyway. Of course, I did not try to do this on the actual client's environment. That is a big no-no! I decided to spin up my own sandbox virtual environment. The steps I followed in my test were:

  1. Install SQL Server 2008R2 Standard Edition (Database Services, Analysis Services, SSMS, BIDS)
  2. Reboot
  3. Upgrade SQL Server 2008R2 Standard Edition to SQL Server 2008R2 Evaluation Edition using the Upgrade Setup Wizard from Installation Center.

As I went through the Upgrade Setup Wizard I got the validation error quoted in this post's title, as expected.

In summary, you cannot upgrade from a licensed SQL Server edition to SQL Server Evaluation Edition. It is perfectly possible, though, to upgrade from SQL Server Evaluation Edition to other licensed SQL Server editions. For more details on the supported upgrade paths refer to the MSDN article: http://msdn.microsoft.com/en-us/library/ms143393(v=sql.105).aspx.

Master Data Services Error: Resolving a long-running stuck Staging Batch with Status "Queued to Clear" or "Not Running"

On some occasions, you may run into a runaway, long-running or stuck staging batch with Master Data Services in SQL Server 2008 R2. The issue happens most frequently when a fairly large batch of rows is being processed for update or deletion.

The issue has to do with Service Broker, either because of a timeout or because notifications are not received or arrive incomplete. You can confirm the issue by querying the queue [mdm].[microsoft/mdm/queue/stagingbatch]; you will see one or more messages in it. After searching through a couple of forum threads I found the solution to be fairly simple: roll back transactions and re-enable Service Broker. The steps are as follows:

  1. Set the MDS database in single user mode and roll back uncommitted transactions.
    ALTER DATABASE [MDS] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
  2. Re-enable Service Broker on the MDS database.
    ALTER DATABASE [MDS] SET ENABLE_BROKER WITH NO_WAIT;
  3. Set the MDS database back into "regular" multi-user mode.
    ALTER DATABASE [MDS] SET MULTI_USER WITH ROLLBACK IMMEDIATE;

Once the MDS database is brought back online, all queued messages should be cleared out and the MDS database will be able to communicate normally with Service Broker.
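
If you want to double-check, you can query the same queue again after the fix; it should now be empty. A minimal sketch, run against the MDS database:

SELECT COUNT(*) AS QueuedMessages
FROM [mdm].[microsoft/mdm/queue/stagingbatch];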

You can check Books Online for more information on the difference between the NO_WAIT and ROLLBACK IMMEDIATE options when issuing the ALTER DATABASE command here. Pinal Dave has a good post about the difference between these two options as well here.

Hope this post has helped you. Your feedback is greatly appreciated.

Using Unary Operators to control Analysis Services hierarchy aggregations

Analysis Services hierarchy aggregations can be easily controlled using unary operators. For example, in accounting there are GL accounts that are grouped into major GL account groups and used in different financial statements like the Profit & Loss Statement, Income Statement, Trial Balance, and Balance Sheet. These GL accounts may affect the balances of the major GL account groups and financial documents differently, adding to or subtracting from the balance.

A simpler example might involve sales quotas for a Sales Department. In some organizations the Sales Department is broken down into sales teams with salespeople assigned to those teams. The overall Sales Department quota is broken down into smaller quotas among these sales teams, and each sales team's quota is then broken down into quotas assigned to the team members. In some organizations, Sales Managers assign these quotas evenly throughout the sales teams and sales team members. Others, in order to guarantee a fat bonus check, decide to raise the bar and assign sales quotas that are higher than the overall department's sales quota. Figure 1 below shows an organizational structure chart of our sample Sales Department with sales quotas that don't sum up evenly.

Figure 1. Sales Department organizational structure chart and sales quotas

As can be seen in Figure 1, the total Sales Department quota is $150,000. The Sales Manager in this case decided to push their teams harder and assigned a sales quota of $60,000 to each team. In a perfect scenario, if all sales teams meet their sales quota, the total sales for the Sales Department would be $180,000, which is $30,000 more than the department's sales quota. Accordingly, each team, in order to impress their Sales Manager and win a free meal and margaritas at the local Mexican joint, decided to exceed the team's sales quota by assigning higher personal sales quotas.

In our BI solution, we are required to create an Analysis Services cube in which the Sales Manager can track these sales quotas at each level of the Sales Department organizational chart as seen in Figure 1. But we cannot simply sum up the sales quotas, because the totals would not match up. We could approach this several ways, but our requirement is to replicate Figure 1 with a single "SalesQuota" measure, with no MDX or additional measures involved.

The Solution

To achieve the requirement imposed above, we can make use of a special attribute property in Analysis Services called UnaryOperatorColumn to control how the level members of our hierarchy contribute to the aggregated value of the level parent.

In a regular scenario, the parent's sales quota at each level would be equal to the sum of its children's sales quotas. That would have been the case if the Sales Manager in our example had decided to distribute the overall Sales Department quota equally among the sales teams and the individual sales team members had done the same with their personal sales quotas. In other words, each sales team's quota would have been $50,000 ($150,000 / 3) and, for Sales Team A for example, each sales person's quota would have been $16,666.67 ($50,000 / 3).

But in our scenario, the parent's sales quota at each level should not be equal to the sum of its children's sales quotas. The only solution, then, is to create an "artificial" child at each level that holds the parent's sales quota and specify this "artificial" child as the only child contributing to the parent's total. This is possible through the UnaryOperatorColumn attribute property in the Dimension designer in BIDS. In Figure 2 you can see the available Unary Operators that can be used in this property along with the resulting behavior as described in Books Online (http://msdn.microsoft.com/en-us/library/ms175417.aspx).

Figure 2. Unary Operators.

The UnaryOperatorColumn attribute property value, as the name implies, is a pointer to a table column that holds one of the Unary Operators listed in Figure 2. For our Sales Department quotas example, we would need a column that holds the unary operator for each level member in our Sales Department dimension. The Sales Department dimension source table would look as shown in Figure 3.

Figure 3. Sales Department dimension source table

Notice that this table is a naturalized Parent-Child table. Also, notice that an "artificial" child was created with the same name as its parent. This "artificial" child is the child that will hold the parent's sales quota value, and it is the only value that will be used in the parent aggregation. The way we control this aggregation is by assigning the "+" Unary Operator to this artificial child and the "~" Unary Operator to the rest of the children.

At the lowest level of our hierarchy, all the way to the right in Figure 3, each sales person is assigned a "~" as its Unary Operator in the SalesPersonUnaryOperatorColumn and only the "artificial" child receives a "+" as its Unary Operator. The same applies to the Sales Team level in our hierarchy: only the "artificial" child is assigned the "+" Unary Operator.

The FactSalesQuota table that holds the sales quotas would look as shown in Figure 4:

Figure 4 FactSalesQuota fact table
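
If the figures above do not come through, here is a rough T-SQL sketch of the two source tables described in this section. Apart from SalesPersonUnaryOperatorColumn, every table, column and member name below is an assumption made for illustration; the downloadable project files at the end of this post contain the actual definitions.

-- Hypothetical, simplified shapes of the naturalized parent-child dimension table and the fact table
CREATE TABLE dbo.DimSalesDepartment
(
    SalesDepartmentKey             int          NOT NULL PRIMARY KEY,
    SalesDepartment                nvarchar(50) NOT NULL, -- top level
    SalesTeam                      nvarchar(50) NOT NULL, -- middle level
    SalesTeamUnaryOperatorColumn   char(1)      NOT NULL, -- '+' for the "artificial" child, '~' for the rest
    SalesPerson                    nvarchar(50) NOT NULL, -- leaf level
    SalesPersonUnaryOperatorColumn char(1)      NOT NULL  -- '+' for the "artificial" child, '~' for the rest
);

CREATE TABLE dbo.FactSalesQuota
(
    SalesDepartmentKey int   NOT NULL REFERENCES dbo.DimSalesDepartment (SalesDepartmentKey),
    SalesQuota         money NOT NULL
);

-- One row per sales person plus one "artificial" child per parent, for example:
INSERT INTO dbo.DimSalesDepartment VALUES
(1, N'Sales Department', N'Sales Department', '+', N'Sales Department', '+'), -- carries the department quota
(2, N'Sales Department', N'Sales Team A',     '~', N'Sales Team A',     '+'), -- carries the team quota
(3, N'Sales Department', N'Sales Team A',     '~', N'Sales Person 1',   '~'); -- ignored by the parent total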

The Analysis Services project

Once we have defined our underlying table structure, we can take a look at the Analysis Services project. Figure 5 shows the basic project definition with the Sales Department Dimension and Sales Quota measure.

Figure 5. Analysis Services project definition

The Sales Quota Measure

The sales quota measure is a straightforward column based measure. It is based on the Fact Sales Quota table and is a simple SUM aggregation. Figure 6 shows the sales quota measure definition.

Figure 6. Sales Quota measure

The Sales Department Dimension

The Sales Department dimension is a simple dimension with the attributes needed to design a drilldown hierarchy. Figures 7 and 8 show the Sales Department dimension definition, user hierarchy and attribute relationships.

Figure 7 Sales Department definition

Figure 8 Sales Department attribute relationships

A very important step in any user hierarchy definition is to specify the key columns at each level. In this case, the only attribute that needs a composite key column definition is the Sales Team attribute. The key column definition is shown in Figure 9.

Figure 9 Sales Team attribute key columns

The deployed cube with the Sales Department definition provided so far would look as shown in Figure 10.

Figure 10. Deployed cube

Notice that the “artificial” children show up and that the children’s sales quotas are being added to the parent’s aggregated amount. To fix the aggregation issue we need to define the UnaryOperatorColumn attribute properties for the children level members. We can hide the “artificial” children by changing the HideMemberIf user hierarchy level property to Parent.

Figures 11 and 12 show the values of the UnaryOperatorColumn property for the Sales Person and Sales Team attributes.

Figure 11. Sales Person UnaryOperatorColumn value

Figure 12. Sales Team UnaryOperatorColumn value

Figures 13 and 14 show the HideMemberIf user hierarchy level property.

Figure 13 Sales Person user hierarchy level HideMemberIf property

Figure 14 Sales Team user hierarchy level HideMemberIf property

The Results

Once these properties have been changed the resulting deployed cube should look exactly as required. Figure 15 shows the exact aggregation behavior as in Figure 1.

Figure 15 Final results
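
As a quick sanity check outside the BIDS cube browser, a simple MDX query against the deployed cube should return the same numbers shown in Figure 1. The cube, hierarchy and measure names below are assumptions based on this post; adjust them to match your own project:

SELECT [Measures].[Sales Quota] ON COLUMNS,
       [Sales Department].[Sales Departments].MEMBERS ON ROWS
FROM [Sales Quotas]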

Conclusion and Considerations

The solution presented above provides the exact results dictated by our requirements and is meant to exemplify and help you understand how unary operators can be used to control aggregation behavior in user hierarchies.

There are other options that provide the same results, but this solution requires no calculated members or MDX at all. While this solution works, keep in mind that there are some performance considerations regarding unary operators and parent-child hierarchies. The key is always to TEST! TEST! TEST!

Sample Files

You can download the project files used in this post here.

Dynamically generate current Year, Month or Date member with MDX

MDX can be extended with Visual Basic functions like FORMAT() and NOW() to dynamically generate the current year, month or date member. For example, this technique can be used as part of a named set to return current-year budget and actuals, or as part of a filter in a PerformancePoint dashboard.

First, a little review of the VB NOW() and FORMAT() functions:

NOW()

The NOW() VB function returns the current system date and time. It is similar to the T-SQL GETDATE() function.

FORMAT()

The FORMAT() VB function returns a string formatted according to instructions contained in a format String expression.
Source: http://msdn.microsoft.com/en-us/library/59bz1f0h(v=vs.90).aspx

There is an extensive list of characters you can use to create your own date and time formats. You can see the complete list here: http://msdn.microsoft.com/en-us/library/73ctwf33(v=vs.90).aspx

For example, you can extract the year portion of your date in the following formats:

y: Displays the year number (0-9) without leading zeros. Use %y if this is the only character in your user-defined numeric format.
Example: FORMAT(NOW(), "y")
Result: 1 (for any date in 2011)

yy: Displays the year in two-digit numeric format with a leading zero, if applicable.
Example: FORMAT(NOW(), "yy")
Result: 11 (for any date in 2011)

yyy or yyyy: Displays the year in four-digit numeric format.
Example: FORMAT(NOW(), "yyy") or FORMAT(NOW(), "yyyy")
Result: 2011 (for any date in 2011)

FORMAT() and NOW() with MDX

The following MDX example uses the Adventure Works 2008R2 Analysis Services OLAP cube. Suppose we need to return all Internet Sales for the year 2011. The MDX script would look something like this:

SELECT [Measures].[Internet Sales Amount] ON COLUMNS
FROM [Adventure Works]
WHERE [Date].[Calendar Year].&[2011]

This works just fine and will return the aggregated sales for 2011. (Note: the Adventure Works 2008R2 sample does not contain sales data for 2011, so the result will be empty.)

But what if instead of a specific year, you are asked to return the Internet Sales amount for the current year? You have 3 options:

  1. On January 1st at 12:01 AM every year you need to change your MDX script to the new current year.
    CONS: You might be passed out from celebrating New Year’s Eve and most probably you will not get to it until a few days out into the new current year.
  2. You could use some MDX functions like LastNonEmpty and LastChild to figure out the last amounts recorded by the most recent date.
    CONS: You may have forecast data or simply bad data with transactions occurring on dates out into the future. (Trust me, I’ve seen it)
  3. You can build a dynamic MDX script that will return the current year based on the system datetime.
    CONS: It may be too sexy for your cube.

The trick is to construct the Date dimension member using the STRTOMEMBER MDX function. The STRTOMEMBER MDX function stands for “String to Member” and as the name implies, it will convert a string to a dimension member. In this case, we want to generate the [Date].[Calendar Year].&[2011] dimension member from a string using the Year portion of the system datetime.

The dynamic string will look like this:

StrToMember("[Date].[Calendar Year].&[" + FORMAT(NOW(), "yyyy") + "]")

Now we can use it in our MDX script:

SELECT [Measures].[Internet Sales Amount] ON COLUMNS
FROM [Adventure Works]
WHERE StrToMember("[Date].[Calendar Year].&[" + Format(Now(), "yyyy") + "]")

The same applies if you want to select or filter data for the current month. In the following example I'll be using the Month level from the Calendar hierarchy and specifying the month of July (7) for the year 2011:

SELECT [Measures].[Internet Sales Amount] ON COLUMNS
FROM [Adventure Works]
WHERE [Date].[Calendar].[Month].&[2011]&[7]

We can generate the current month member as follows:

SELECT [Measures].[Internet Sales Amount] ON COLUMNS
FROM [Adventure Works]
WHERE StrToMember("[Date].[Calendar].[Month].&[" + Format(Now(), "yyyy") + "]&[" + Format(Now(), "M") + "]")

Note that to specify the Month portion we use a CAPITAL letter M because the format string is case-sensitive; a lowercase letter m stands for minutes. The complete list of characters can be viewed here: http://msdn.microsoft.com/en-us/library/73ctwf33(v=vs.90).aspx
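
Finally, the same trick can generate the current Date member. The sketch below assumes the Date attribute key follows the yyyyMMdd integer format used by the Adventure Works sample (for example [Date].[Date].&[20110711]); adjust the format string to match how your own date keys are built:

SELECT [Measures].[Internet Sales Amount] ON COLUMNS
FROM [Adventure Works]
WHERE StrToMember("[Date].[Date].&[" + Format(Now(), "yyyyMMdd") + "]")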

Afterthoughts: SharePoint Saturday Tampa 2011

This past Saturday, June 11th, 2011, I had the opportunity to co-present at SharePoint Saturday Tampa thanks to a special invitation by its main organizer, Michael Hinckley (Twitter|Blog|LinkedIn). It was a great experience and a very well-organized event. I met several SharePoint professionals I follow on Twitter for the first time and learned more about them and others in the industry who are, in a few words, Rockstars!

Speaker Dinner & Event Networking
As a speaker, I was invited to attend the speaker dinner at Donatello Restaurant (website) on Friday evening, June 10th. The place was a great choice as we had the lounge & bar area mostly to ourselves. The food was excellent. At the dinner party I had the opportunity to connect with several friends from the area, some of whom I had not seen in person for a while, such as

Michael Antonovich (Blog|Twitter|LinkedIn),
Assistant Director of Web Services at Orange County Public Schools

Landon Bass (Blog|LinkedIn),
Systems Engineer III – SharePoint Engineer at Raymond James

Chad Miller (Blog|Twitter|LinkedIn),
Sr Manager Database Administration at Raymond James

I also met some other cool guys like

Chris Schwab (Blog|Twitter|LinkedIn),
SharePoint & PowerShell guru

David McNamee (Blog|Twitter|LinkedIn),
SharePoint Technology Specialist at Microsoft

Michael Oryszak (Blog|Twitter|LinkedIn)
Practice Manager at Intellinet, SharePoint Server MVP, MCITP

Marc D. Anderson (Blog|Twitter|LinkedIn),
Co-Founder and President at Sympraxis Consulting LLC

The Presentation
Michael Hinckley and I co-presented "Extending Business Intelligence Solutions with SharePoint 2010," in which we gave an overview of what SharePoint 2010 has to offer for Business Intelligence. Michael did a great job covering SharePoint 2010 deployment considerations, licensing and edition comparisons. I then gave a quick overview of Business Intelligence, what it is and what it is not. The audience was very engaged and we spent at least 10 minutes discussing the BI self-delivery model and the governance necessary around it to ensure the "single version of the truth."

Click here or on the image above to download the presentation.

David McNamee from Microsoft attended the presentation and contributed to the discussion regarding governance and the self-delivery model. He made a statement that resonated a lot with me as an IT professional regarding these politics and battles between IT and the business:

…if IT and the Business don't talk and collaborate, and the business does not consider IT a strategic partner, then I'm in the wrong organization.

This is very true and I agree 100% with David. As a matter of fact, a big reason why I left a past employer was these unhealthy politics.

At the end of the presentation I showcased Excel Services, PerformancePoint Services and one of the coolest features in SharePoint 2010: Decomposition Trees. We got great feedback, and hopefully Michael and I will have an opportunity to co-present again at other events. I felt Michael and I had good synergy co-presenting (even with no rehearsal).

This may be a taste of things to come from the synergy between the Tampa Bay Business Intelligence User Group and the Tampa Bay SharePoint User Group.

The Sessions I attended
I attended just a couple of sessions as I spent most of my time networking with attendees, speakers and sponsors and promoting our own upcoming SQL Saturday #86 BI Edition on November 5th.

While I am not a SharePoint Developer (yet), I really enjoyed Marc D. Anderson's presentation "Developing in SharePoint's Middle Tier." I learned about his awesome SPServices jQuery library publicly available on Codeplex (http://spservices.codeplex.com/) and about his SharePoint XSL templates, also available on Codeplex (http://spxslt.codeplex.com). Through his jQuery library and XSL templates you can easily customize the user experience of SharePoint forms with cascading dropdown menus, selectable textbox word hints, real-time data validation and much more. Really cool stuff.

During lunch time I attended a session / discussion by Kathy Malone (Meetup) and Blain Barton (Blog|LinkedIn) titled "Building Social Media Communities; How SharePoint Fits In." This was an interesting discussion as it involved one of my favorite topics: Social Media. We discussed Twitter and how it can help startups and technical communities. I added my two cents ($0.02) and explained how the #sqlhelp hashtag helps a lot of SQL professionals get immediate answers from experts around the world. We discussed that for SharePoint there is #sphelp available. I was able to walk away with a signed copy of "Office and SharePoint 2010 User's Guide" authored by my good friend Michael Antonovich. You can buy his book here.

Another great session I attended was presented by Ryan Morgan (Blog|Twitter|LinkedIn), titled "Silverlight Dashboards in SharePoint 2010." I have not developed anything at all with Silverlight but was interested to learn how complex or easy it is to create custom dashboards in SharePoint 2010 with Silverlight. It turned out to be very simple, or at least Ryan was so effective at simplifying things that even a Silverlight newb like me understood it. I also learned how easy it is to query SharePoint lists using the available APIs with jQuery and LINQ. Very cool stuff.

I look forward to speaking at and attending another SharePoint Saturday. There is so much to learn from the experts who speak at this event that I can apply to my line of work, which is Business Intelligence. As Michael Hinckley and I have discussed, there is a lot of overlap between SharePoint 2010, SQL Server and Business Intelligence, and with Microsoft focusing its efforts on making SharePoint the preferred enterprise information management platform, SQL/BI and SharePoint folks need to engage more and more.

SSRS Error: Report Preview displays Access to the path ‘…\bin\Debug\Report.rdl’ is denied

While working on Reporting Services at a client site I came across this error when trying to preview a report in BIDS:

The report preview failed because the report could not be built. Read the errors, warnings and messages in the Error List window for specific build failures.

An error was raised in the Error List window below with the following message:

Access to the path ‘C:\Users\Jose\Documents\Visual Studio 2008\Projects\Report Project\bin\Debug\Report.rdl’ is denied

The root cause of this error is unclear to me, but it seems that the Debug folder's permissions get changed to Read-only for no apparent reason, as can be seen in the screenshot below:

The workaround is to uncheck the Read-only attribute checkbox for the entire Debug folder, or simply to delete the Debug folder; BIDS will recreate it when you click on the Preview tab. If I come up with the root cause of this issue I will update this post.

The development environment is:
– VMware virtual machine
– Windows Server 2008 R2 64-bit
– SQL Server Reporting Services 2008 R2 64-bit

SSRS errors: (rsReportServerDatabaseUnavailable) The report server cannot open a connection to the report server database.

Sometimes you may get this error when trying to launch your report server webpage:

The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing. (rsReportServerDatabaseUnavailable)

While the error is descriptive enough to let you know that Reporting Services could not find the report server database, it may be a challenge for those who have no idea where this database is and what it is used for. To your surprise, there is more than one database involved.

Reporting Services requires two databases to run:

  • ReportServer
  • ReportServerTempDB

In short, the ReportServer database is needed to store report definitions, report history and snapshots, report models, shared data sources, resources, scheduling and delivery information, metadata and several other objects. ReportServerTempDB is where Reporting Services stores session and execution data, cached reports, and work tables that are generated by the report server. For more information about the Reporting Services databases visit: http://msdn.microsoft.com/en-us/library/ms156016.aspx.

To solve the error mentioned above you need the following information:

  1. What is the SQL Server Reporting Services instance you are trying to reach?
  2. Where are the SQL Server Reporting Services databases hosted?

Assuming you have the right permissions, you will need to run Reporting Services Configuration Manager and point it to the Reporting Services instance, as seen in the image below. You would normally find Reporting Services Configuration Manager under Start->Programs->Microsoft SQL Server 2008 R2->Configuration Tools.

Once you connect to your Reporting Services instance through Reporting Services Configuration Manager, go to the Database tab located on the left pane to view the name of the SQL Server instance where the ReportServer database is hosted and the name of the ReportServer database, as seen in the image below.

Keep in mind that in some environments the ReportServer database may be hosted on a separate SQL Server database instance and may have been renamed to something other than the default "ReportServer" database name. You will usually find this type of configuration in a scale-out architecture. For more information on scale-out architecture visit: http://sqlcat.com/technicalnotes/archive/2008/06/05/reporting-services-scale-out-architecture.aspx.

Once you have located the SQL Server database instance where the ReportServer and ReportServerTempDB databases are hosted, run through this checklist (a quick T-SQL check for the first three items is sketched after the list):

  1. Is the SQL Server database instance running? Is the SQL Server service started?
  2. Can you verify connectivity to the SQL Server database instance?
  3. Are the ReportServer and ReportServerTempDB databases attached and online?
  4. Does the Reporting Services service account have read/write access to the ReportServer and ReportServerTempDB databases?
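
As a minimal sketch for the checklist above: connecting to the instance with SSMS or sqlcmd covers items 1 and 2, and the query below covers item 3 (adjust the database names if yours were renamed, for example in a scale-out deployment):

-- Both databases should report state_desc = 'ONLINE' and user_access_desc = 'MULTI_USER'
SELECT name, state_desc, user_access_desc
FROM sys.databases
WHERE name IN (N'ReportServer', N'ReportServerTempDB');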

Once all of this is verified to be true and any issues have been fixed, the last step is to restart the Reporting Services instance. If everything is right you should be able to access your reports. It is highly recommended to frequently back up both the ReportServer and ReportServerTempDB databases. If you lose your ReportServer database you may re-deploy reports, data sources and datasets, but you will lose all subscriptions, schedules and report parts that users may have created.

Also, keep in mind that the ReportServerTempDB database does not behave like the SQL Server tempdb. Per MSDN Books Online:

“Reporting Services does not re-create the temporary database if it is missing, nor does it repair missing or modified tables. Although the temporary database does not contain persistent data, you should back up a copy of the database anyway so that you can avoid having to re-create it as part of a failure recovery operation.” For more details visit: http://msdn.microsoft.com/en-us/library/ms156016.aspx

 

Lessons from my first week on Twitter.

I finally gave in to the peer pressure of social media and started tweeting and following fellow twitterers in the SQL/BI arena. The first set of questions that popped up in my head when I made the decision to start my Twitter account were:

  1. What am I going to tweet about?
  2. Who will care about what I got to say?
  3. Will following too many people cause an information overload?
  4. Should I limit my tweets to SQL/BI only matters?
  5. Will I be lost with all the lingo and abbreviations as in SMS texting?
  6. Will I become addicted once again to something non-productive?
  7. If I have survived life so far without Twitter, why do I need it now?
  8. If I complain about not having enough time balancing work, family and professional development, how can I justify twittering?

Although I haven't been able to answer all of these questions, I decided to bite the bullet and get on Twitter. The closing case for becoming part of the Twitter gang was made by Jorge Segarra @SQLChicken during a SQL Server 2008 R2 / SharePoint 2010 presentation at the April 5th Tampa Bay SQL BI User Group meeting. Jorge's statement about Twitter being a "legitimate business and professional tool for DBAs" was heard by my boss as well, whom I had invited to attend this particular meeting. Now I can justify my tweeting and blogging at work. Yay!

My next challenge was to justify my twittering at home. Of course, I wear the pants at home but… it is my wife who chooses which ones I should wear, when and where. Her resistance was as expected, and she made me realize that I have become what I have always criticized her younger sister for being… a texting junkie! OK, so less twittering at home, I concede. See what I mean about her deciding when and where? She just knows how to keep the family harmony.

Eighty (80) tweets later within a week, tweeting as @sqljoe, I have learned a couple of things:

  1. Twitter feels much like those old IRC chats in which you can filter messages by sender and topic (#channels).
  2. Twitter is a great way to get in touch with respected SQL/BI experts such as seasoned DBAs, developers, authors and folks from Microsoft.
  3. It is interesting to learn what other DBAs are doing in their day-to-day tasks, the challenges they face and how they approach them, as well as the technologies they play with at their jobs.
  4. Twitterers and bloggers are an important part of technology marketing strategies to get the word out quickly about new products and features with minimum effort and cost, other than fine wine and limos (#sqlr2 Microsoft tweetup).
  5. There are SQL chefs, midnighters, chickens, sheeps, chicks, bucks, knights, doctors, m-a-chanics, soldiers, runners and of course regular joes and green handed masters.
  6. SQL Twitterers use the word Pimpin a lot.
  7. Body disposal services can be contracted through Twitter.
  8. SQL experts have hair feuds as well.
  9. Margaritas make you enjoy progress bars way too much.
  10. DBAs do have lives and are a good source of recommendations for places to eat.

Among other things, I think my first week on Twitter was, for lack of better terms: interestingly fun, cool and informative.
