Focusing on setup and development using SQL Server products from 2000 to 2008 and beyond. Also about anything geeky that compels me to write a blog entry.

Monday, December 28, 2009

Redirecting SSIS Data Errors to Avoid Package Failure

When setting up a new Integration Services package, it is often advantageous to leave your errors untrapped so your package fails and you can interpret the errors. But after you get past the first wave of setup, it is often best to trap additional data errors in your package to prevent complete package failures.

So how do you do it? I'll cover it in broad strokes.

I start with my source table structure. To capture error output, I want to create a table that matches the field names of the source table, but not the data types. The idea is to have a table so wide open that it can handle any data you throw at it.

For example, the source table, let's call it T1, has these fields:
ID Int Not Null
Name varchar(30) Not Null
Addr varchar(40) Not Null

To handle error output, you would create a table (T2) with these fields:

ErrCode varchar(100) Null
ErrColumn varchar(100) Null
ID varchar(100) Null
Name varchar(200) Null
Addr varchar(200) Null

Let's go over the noteworthy differences. First, I added the ErrCode and ErrColumn fields, which will be populated with data from the SSIS error handling.

Most importantly, I changed all of the fields to varchar and made sure the sizes were big enough to handle even bad data. Depending on your situation, you may make these much larger to accommodate the worst possible oversized data records.

Lastly, I allowed Nulls in all of the columns to ensure that every field of data in the row with errors can be stored in the error table, regardless of Nulls.
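Putting the pieces above together, the two tables might look like this in T-SQL (a sketch using the example fields; your real names and sizes will differ):

```sql
-- Source table with strict types
CREATE TABLE T1 (
    ID   int         NOT NULL,
    Name varchar(30) NOT NULL,
    Addr varchar(40) NOT NULL
);

-- Error table: everything varchar, everything nullable,
-- plus the two columns fed by the SSIS error output
CREATE TABLE T2 (
    ErrCode   varchar(100) NULL,
    ErrColumn varchar(100) NULL,
    ID        varchar(100) NULL,
    Name      varchar(200) NULL,
    Addr      varchar(200) NULL
);
```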

Now that the structure is in place, it is time to add to the SSIS package. In the Data Flow where you are pulling from your source table, in our case the nondescript T1 table, we must add the error output. Within the data flow, drag a destination object from the Toolbox and connect it to your new T2 table. Then click on the source object that connects to T1 and drag the red line from the bottom of the object to the T2 destination box.

If you edit the Error Output in the T1 Data Source, you can specify what should be done with each column of data. You can have a column cause the package to fail on error, ignore the error, or redirect the error output, which is the option we want to use. By redirecting all columns on error, they become available to consume in your T2 table data destination.

Now that the red line is attached, you can edit the properties in your T2 destination to ensure that ErrCode, ErrColumn, and all other fields are mapped properly from the error output to your T2 fields. Once the mappings are established, you should be ready to run.

To me the great benefit of this type of error handling comes in dealing with bad data that you are pulling from production databases into secondary aggregate, data warehouse, or reporting databases. It allows you to get most of the data into the primary tables and the leisure to review the bad results afterward. Of course, if the data you are pulling has requirements that would make partial data useless, you must weigh that when considering the error output option. Regardless, using error output is an excellent way to troubleshoot as you develop a new SSIS package.

Tuesday, December 22, 2009

SSIS Package Credentials

This is a great article on setting up SSIS Package Credentials, which is more involved than the old DTS days. If you need to schedule your SSIS packages, read this article and it will save you a lot of troubleshooting time.

Friday, December 18, 2009

Creating a Reporting Service Report Template

I read this good, simple process for creating a template for your company reports that was worth sharing.

Friday, December 11, 2009

Netbook upgrade to Windows 7 with USB key

Great tip on how to set up your thumb drive for loading Windows 7. Is it time for a netbook upgrade?

Wednesday, November 25, 2009

Building My First Website Using Microsoft Expressions

In years past, I have used MS Front Page to develop several websites, both personal and as a consultant. In the last few years I have only built ASP.NET web apps until this last weekend. I dipped my toe back into the website design pool using a tool that was new to me, Microsoft Expressions.

Expressions has many similarities to Front Page, but it goes further in mixing the graphical design tools with a light layer of the .NET controls to extend your site's functionality. In my case I just needed to build a basic website without any data interaction, at least at this point. I found the templates, menus, and toolbars to be well laid out and fairly intuitive.

I was able to get up and running and finish the first draft of a full website in around ten hours never having seen the interface ahead of time. To me that is the best indication of a well designed product.

The dynamic web templates allowed me to easily manage the top banner, navigation menu, and side bar separately from the page content on each page throughout the site. I did discover some hang-ups with the template approach. After developing eight or nine pages, I realized I needed to add a new editable zone to the template. I quickly discovered that this is a major issue once you have already applied your template to one or more web pages.

Overall I give Expressions Web a big thumbs up as an excellent web development tool, and I look forward to using more of the functionality over time.

Saturday, November 14, 2009

Conditionally Blocking Scheduled Reporting

As is true for many DBAs, I have developed a custom set of SSIS packages along with a database that manages scheduled Reporting Services reports. It delivers to a mixture of email recipients and FTP locations after generating the output PDFs and Excel results.

Then came the inevitable weekend of trouble. The set of SQL Server 2005 databases that provide the data for the reports failed to load due to trouble with the source legacy systems. However, since the SQL Server databases still contained yesterday's data, the report schedule had no problem completing the scheduled reporting.

Therein lay the problem. I did not want any reports to generate, or worse yet email to recipients, if the recent day of data was not included.

The solution is fairly simple. The SSIS packages that load the SQL Server databases were logging to a table, which included a special status column. When each database finished loading and completed any required follow-up updates, log entries with statuses of DONE and UPDATED were written. I just created a stored procedure to scan the log for databases that were scheduled to be loaded, filtered to the UPDATED entries, with a log time that fit into the correct date/time range for the current day's data load.

If the stored procedure found any records that did not meet the above criteria, it returned a result reporting accordingly. Then I just added a call to this stored procedure in my primary scheduled reporting stored procedure, where the list of scheduled reports to process is generated. If the data load check reported problems, I prevent the scheduled reports from processing. Then once the problem with the data load is resolved and all data is available, all I have to do is run the SQL Agent job that runs the scheduled reports, and this time it will succeed.
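The check can be sketched in T-SQL like this. The table and column names (ScheduledLoads, LoadLog, DatabaseName, Status, LogTime) are assumptions for illustration, not the actual schema:

```sql
-- Sketch: return any database scheduled to load today that has
-- no UPDATED log entry since midnight. Names are hypothetical.
CREATE PROCEDURE dbo.CheckDataLoadStatus
AS
BEGIN
    SELECT s.DatabaseName
    FROM dbo.ScheduledLoads s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.LoadLog l
        WHERE l.DatabaseName = s.DatabaseName
          AND l.Status = 'UPDATED'
          -- midnight of the current day (SQL 2005 style)
          AND l.LogTime >= CAST(CONVERT(varchar(8), GETDATE(), 112) AS datetime)
    );
END
```

If this returns any rows, the primary reporting procedure skips generating the day's report list.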

I'm always a fan of a little more work in the back end to make a little less work for the routine. Not to mention, you don't have to answer Fred's email asking, "Why did I get two emails of the same report, and which one is right?"

Sunday, October 18, 2009

A New Tablet Computer

This is a super cool new kind of Tablet computing on the horizon. I can easily see how this could change the delivery mechanism for text books on college campuses in time, besides the obvious business uses.

Researching and Bypassing DNS Issues

So we have almost finished transition three out of four. Move to a domain - check. Move to a new file/print server - check. Number three was the biggie.

Move the servers to an off-site data center - almost check.

We moved thirty-seven servers to a data center last weekend, changed all of the external and internal IP addresses, and planned to be up and running by last Monday morning. I can hear you chuckling as you read, and yes, it was probably too ambitious, but that was the directive, so we did it.

DNS issues, both internal and external, are one of the big problems in a transition like this. The DNS server must be properly updated with all IP changes, the local office DNS server must be updated as well, and the external registrar must be updated with new external IP addresses, which can take a few days to propagate.

So if you are like me, a non-networking guy supporting the transition, what are some of the best ways to work on connectivity issues for users? Start with DNS. Here are some of the main testing steps I took to evaluate problems.

1) Ping the name you are trying to reach. If it resolves to the correct IP address but times out, there may be a real connection issue. If it resolves to the old IP address or something else unexpected...
2) Flush the DNS cache. From the command prompt, just run "ipconfig /flushdns" to remove any cached DNS resolutions so you can verify the IP is coming from the latest changes on the DNS server. This is also a good time to check the DNS1 and DNS2 IP addresses in the NIC properties to verify the PC/Server is pointing at the correct DNS server. Then ping again. If you get the same problem, move on to the next step.
3) Ping the IP address. If you can't Ping it, it doesn't always mean anything since some servers are configured to prevent pinging them. But if you know you could ping it from another machine, you should be able to ping it from the current machine.
4) Check DNS suffixes. If you have multiple domains like we do, during this type of transition you could have some resources showing up under the wrong zone due to config errors, so a server may now be registered under a different zone than you expect. If you add DNS suffixes to the suffix list in the NIC properties, you give short names more opportunities to resolve. By that I mean when you commonly go to SERVER1 rather than typing the fully qualified name, the suffix list will kick in to help.
5) For file shares, use the IP. If you can't map a drive or reach a resource through \\SERVER1 then just type \\IPADDRESS and see if you can reach the file share. When DNS issues are in play, often it is best to just work around it until the dust settles. Make sure you document the places where you do this type of thing so you can go back and change them to names when the issues are resolved.
6) If you must, change the hosts file. I save this for a last resort, but especially on servers, it is sometimes necessary in order to get processes working quickly. The hosts file (on Windows XP/2003, find it at Windows\system32\drivers\etc\hosts) allows you to bypass the DNS server. You type in an IP address followed by the name you want to resolve to that IP address. For example, adding a line with an IP address followed by "MyPC" means that if you ping or reference "MyPC", it will resolve to that IP address. It is like a personal DNS server for your PC/Server, managed only by you.
WARNING: If you make changes to this file, you are overriding the real DNS server, so do it with caution, and preferably only on a temporary basis. If you leave your changes in, you risk the IP changing again in the future, the DNS server being updated, and your machine never seeing the change.

Those are some of the highlights of repeating issues and resolutions I used this last week. Hope it helps someone else in their time of transition.

Wednesday, September 30, 2009

Handling Layered Transitions

We've all been there. The company is adding a new domain and the PCs and servers must be moved to the domain. A new file server is replacing the old one, and everyone must be remapped to use the new server. The servers are being moved to an off-site data center, and all the external IPs must be rerouted. The company is moving offices.

Any of these transitions can cause stress in the office. In my case, we are going through all of the above.

So how do you keep your sanity and avoid self-destructing during these stressful times? Here are a few tips I am trying to follow.

1) Always remember the pressure that everyone else is feeling. This means you can't take things personally if emotions run high, as they are bound to do. Try to be the calm voice of reason.

2) Find out how you can help rather than complaining about mistakes. When everyone is overwhelmed by the pressures of the transition, it is inevitable that some details will be overlooked and some mistakes will be made. What is important is that you contribute to the solutions.

3) Put in the hours that are needed to get the job done. I like my home life as much as anyone, but these transition times are time limited, they don't go on forever. If you put in extra time to make it go more smoothly, and your co-workers do the same, you will all have a much smoother transition.

The bottom line to me is that major transitions are always tough, and sometimes you just have to gut it out.

Sunday, September 20, 2009

LogMeIn.com Ups and Downs

I have been using LogMeIn.com for about three weeks now to access my PC. As a quick overview, LogMeIn allows you to remotely access your PC from any computer with an internet connection using a web browser. You install a local service on the PC that you are logging into, which talks to the LogMeIn server, thereby resolving your IP address even if you have a floating IP on a home internet connection like DSL.

This isn't anything new, and LogMeIn seems to be the most heavily marketed player at this point. However, LogMeIn offers a free option with limited functionality. Basically, you can log in and use your PC, but you cannot transfer files or print locally.

You can register multiple PCs in your one website account, thereby simplifying your remote access needs.

Overall I have to say I am very impressed. Typically I have used Windows remote desktop to connect to my PC in conjunction with a VPN connection. Remote desktop is great but I hate dealing with the slow speed of the VPN, and the VPN drop outs.

With logmein, when I first login the screen resolution is sized to fit the browser and it maintains the remote PC's aspect ratio. This can be very handy for quick access to do something and then log back out. A complaint I have always had about remote desktop is that the screen will resize if you are logging in from a PC that uses smaller screen resolution than the remote PC. This can be a slow and sometimes messy process.

On the flip side, my beloved shortcut keys don't work through Logmein, while they do work with remote desktop. Honestly though, this is the main shortcoming that I have found so far. Overall, I'm a fan, and I think I will continue to be a user.

Thursday, September 17, 2009

Windows Shortcut Keys

Sometimes when I use Windows shortcut keys, it reminds me of the good old days of Wordperfect. Ok, I take that back. Trying to remember those hotkeys was miserable. That doesn't change the fact that there are a few Windows shortcut keys that I use every day and don't want to live without.

Here are a few of my favorites.

WindowsKey + E
I hate digging for Windows Explorer in the Start Menu or right-clicking the Start Menu to explore. Give me a quick Win+E and I'm in business.

WindowsKey + R
Quick access to the Run... box makes for a quick "cmd" or "regedit" or whatever the occasion calls for.

WindowsKey + L
I usually Lock my PC when I wander away for lunch or at the end of the day. While you can get there from CTRL + ALT + DEL, this is even faster.

CTRL + SHIFT + ESC
Fast access to the Task Manager so I can see which program is hammering the CPU or eating all the memory.

CTRL + ALT + DEL
The old classic, which I mainly use for a password change.

Feel free to comment with any of your favorite shortcut keys.

Monday, September 14, 2009

Domain Move Part 2

After the first attempt at moving a SQL 2005 Server from one domain to another, I learned several lessons to help me make the permanent transition to the new domain. Here are a few of the key things I checked and/or changed.

- Verify that the new DNS server has entries for the short names of all servers that SQL Server needs to access. I found that several servers would resolve to [servername].[full domain name] but not to simply [servername]. I had to request CNAME entries for all the resources that required (due to extensive existing code) the ability to access the server through the short name.
- Make sure all services are running under local accounts.
- Make sure the Reporting Services execution account is running under a local account.
- If any objects are owned by domain level accounts, change ownership to local accounts.
- If any SQL Agent Jobs are scheduled to run under domain accounts, change them to local accounts.
- Check SSIS packages for any domain account ownership and change to local account.
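A couple of quick queries can help surface the domain-account ownership called out in the list above (a sketch; 'OLDDOMAIN' is a placeholder for your actual domain name):

```sql
-- SQL Agent jobs owned by an old-domain account (SQL 2005)
SELECT name, SUSER_SNAME(owner_sid) AS job_owner
FROM msdb.dbo.sysjobs
WHERE SUSER_SNAME(owner_sid) LIKE 'OLDDOMAIN\%';

-- Databases owned by an old-domain account
SELECT name, SUSER_SNAME(owner_sid) AS db_owner
FROM sys.databases
WHERE SUSER_SNAME(owner_sid) LIKE 'OLDDOMAIN\%';
```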

I still had a few issues, but for the most part, after checking the previous list and making changes as needed, I was able to change one SQL 2005 server from one domain to another, and to move a SQL 2005 and a SQL 2000 server from a workgroup to the domain, with little trouble.

If you have any more tips, please feel free to leave comments to flesh out the topic.

Thursday, September 10, 2009


One week ago, an article I wrote on FILESTREAM in SQL 2008 was published on SQLServerCentral.com. This was the first time I had written for a professional publication, online or in print, and I want to thank Steve Jones for giving me the opportunity.

In case you missed it, here is a link to my article.

I highly recommend that those of you with any inclination toward documentation or training start writing articles. It not only forces you to dig deeper on a subject than you may have done otherwise, but it also helps to establish you as a professional in the field. I know we all have busy lives, but I think it is worth it. Only time will tell.

And speaking of Steve Jones, check out his recent blog entry on Technical Article Writing where he lists several good places to submit your work.

Wednesday, September 9, 2009

T-Mobile's Cheap Internet and Mobile Blogging

I am using a 30 day trial of T-Mobile's web plan for cell phones. This plan includes unlimited web access but does not include downloads or a text plan. The up side... it only costs $10/Month.

So far I am very pleased. In fact I am writing this blog on my Shadow phone at a time that would usually be lost to simple waiting. Since I am used to the two letters per key typing method, I can crank out the words at about 60% of keyboard typing speed.

The strange thing is that T-Mobile limits this cheap web plan to phones that it considers non-smart phones. If you have a G1 you have to use at least the basic $25/Month data plan. Even stranger is the fact that they don't consider a phone like the Shadow, which runs Windows Mobile 6, a smart phone. But hey, I'm not complaining.

This little phone is still not as efficient as the ever popular netbooks, but on the flip side I always have my phone. Mostly I think it fulfills a role that I require within the ever broadening array of Internet enabled devices.

Thursday, September 3, 2009

SSIS Packages fail on ODBC connections after .NET Framework 3.5 SP1 or 2.0 SP2

If you work with ODBC connections through third-party ODBC drivers such as MySQL, you may have already experienced the pain of this Windows Critical Update. This is my tale of woe.

I was innocently applying critical Windows Updates to one of my SQL Servers, one of which was the .NET Framework 3.5 SP1. I read the details and saw that it also includes SP2 for both .NET 2.0 and 3.0, but still, it is a critical update. As always, I figured there might be some bumps in the road, but that I would just work through them. Updates applied. Basic database and reporting all tested fine. So far so good.

The next morning, all of my SSIS packages that pull data through ODBC connections have failed, and as a result all of my subsequent scheduled reporting has also failed. That is what I call having Monday on Wednesday.

I know what happened due to my daily email report of all jobs from the last 24 hours, sorted by the errors first. In checking my SSIS log files, I find this error:

OnError,[Server],[UserAccount],[TableName],{GUID1},{GUID2},9/1/2009 11:15:48
PM,9/1/2009 11:15:48 PM,-1071607767,0x,SSIS Error Code
DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "[ComponentName]"
(31)" failed because error code 0x80131937 occurred, and the error row
disposition on "output column "[ColumnName]" (1848)" specifies failure on error.
An error occurred on the specified object of the specified component.
There may be error messages posted before this with more information about the
failure.
After some research, I found that this is a known issue documented on an MSDN blog. It seems that .NET Framework 2.0 SP2 changes the way the OdbcDataReader checks for Nulls, resulting in compatibility issues with some ODBC drivers. It impacted both the MySQL and CVIEW (for ISAM connections) ODBC drivers that I was using to pull in all my data.

As explained in that excellent blog post, the preferred option is to add error output to your SSIS packages, thereby handling the bogus errors. Unfortunately, I have six or seven packages with twenty or more DataReaders in each, so it would not be a quick fix. The blog's option #2 is to roll back the updates, which for me was the only option. I had production people waiting.

Due to the 3.5 constraints, I had to remove .Net 3.5 SP1, then .Net 3.0 SP2, and finally .Net 2.0 SP2. Then, because 3.0 and 2.0 updates are slipstreamed into the 3.5 SP1 update, at that point I had zero .Net frameworks installed. I had to reinstall .Net 2.0 and SP1, which is all I need for my SQL 2005 and related functionality. It took roughly 20-30 minutes to uninstall and reinstall along with reboots, not including any download requirements.

Afterward, I started playing with the website and realized the IIS default page was not even available. I looked at properties in the default website, looked at the ASP.NET tab... bingo, the ASP.NET version is set to... blank. I click the drop-down and see .Net 2.0 is available, but knowing that the scope of the settings goes far beyond this setting, I went back to the command line.

I knew I needed to run regiis, but I don't do this frequently so I searched, found a refresher, and ran the following command:

C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\Aspnet_Regiis -i

Because I am on a 64-bit server, I was working in the Framework64 subdirectory, but if you are on 32-bit, the command is the same except you browse to the Framework subdirectory.

Now my baseline ASP.NET is configured in IIS. I browse to the "/reportserver" directory in IE8 and get a (403) Forbidden error. That's an improvement.

I looked around in various places at various settings for a bit, just to help me think. In the Reporting Services Configuration Manager, it hit me that I had just made a mess of IIS, and in the process the virtual directories had been through the wringer. So under Report Server Virtual Directory and Report Manager Virtual Directory, I checked the "Apply default settings" checkbox and applied the changes.

Boom, back in business. Reports were up and running and I was able to run my SSIS package to generate all my scheduled reports. Then I remembered one other little ASP.NET application that I wrote for managing the report schedules. Tried to pull it up in the browser and it didn't open.

After looking at IIS settings for a while, I finally found the problem. I right-clicked on the Virtual Directory, went to the Properties-->Documents tab, and default.aspx was not in the list. In the process of resets, apparently the default content page list had been reset on my virtual directory. I added default.aspx to the list and boom, everything was back up and running.

I hope that at some point Microsoft makes allowances for this issue in their .Net updates but in the meantime, doing without .Net 2.0 SP2 is fine by me.

Saturday, August 29, 2009

First Attempt At Changing Domains on a SQL 2005 Server

Well I have finished my first pass at moving one of our three SQL Servers to our new domain. I had to roll it back for the moment, but I learned a great deal in the process. This server runs on Windows 2003 Server 64-bit and SQL 2005 Enterprise Edition 64-bit. This server also runs Reporting Services and Integration Services packages to run scheduled reporting, so there are a number of things to test.

1) I changed the DNS Server IP addresses in the network settings to point at the DNS server which is part of the new Active Directory.

2) I changed the domain from the current domain to the new domain and completed the required reboot.

3) I set up security groups from the new domain on the SQL Server and assigned them the appropriate permissions. By assigning the domain groups as logins, it simplifies adding and removing employee permissions by making it a simple Active Directory group addition or removal for the user account.
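In T-SQL, that third step looks something like this (the domain, group, database, and role names are placeholders, not the actual environment):

```sql
-- Add an Active Directory group as a SQL Server login
-- (NEWDOMAIN\SQLReportUsers is a hypothetical group)
CREATE LOGIN [NEWDOMAIN\SQLReportUsers] FROM WINDOWS;

-- Then map it into a database and grant permissions via a role
USE SalesDB;  -- placeholder database name
CREATE USER [NEWDOMAIN\SQLReportUsers]
    FOR LOGIN [NEWDOMAIN\SQLReportUsers];
EXEC sp_addrolemember 'db_datareader', 'NEWDOMAIN\SQLReportUsers';
```

From there, giving an employee access is just a matter of adding them to the AD group.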

4) I could connect to the server from my local PC (which I had already moved to the domain) from Management Studio. That gave me comfort that basic name resolution and integrated security were working properly.

5) From an IE browser, I tried to pull up the Reporting Services browser. No luck. In researching it, I found that the encryption key was not working after the domain change. I also found that a domain account from the old domain was set up for "Execute as" permissions. I attempted to change this to a new domain account, but it failed. I tried restoring the backup of the encryption key; it failed. I tried creating a new encryption key; it failed.

Basically everything I tried to get Reporting Services up and running under the new domain failed. I was able to join the old domain and reset Reporting Services back to a working state so I can do some more research on the issues I encountered. I am considering moving the server to workgroup security and getting everything working right first, and then joining the new domain. Until I do some more research I am not sure.

Based on month end and some other priorities, I probably will not get another attempt at this until the middle of September. If anyone reading this has some advice or some good articles/blog entries on the subject, I welcome them.

Tuesday, August 25, 2009

Preparing To Join A Domain and Change Domains

Currently I am managing three production SQL Servers:
SQL 2000 on Windows 2003 with Workgroup security.
SQL 2005 on Windows 2003 with Workgroup security.
SQL 2005 on Windows 2003 with Domain/AD security.

We just spent the weekend moving all the PCs to a brand new domain controller with a new Active Directory. The servers are next.

I have been surprised at how little is on the web about joining an existing SQL Server installation to a domain, presumably because most people start on one or never move to one. There is a little more on changing domains, but it still seems a little sketchy.

So sometime in the next week, all three servers will be members of the new domain. These servers use IIS, heavy Reporting Services, heavy Integration Services, and extensive cross server connectivity.

Hopefully I will have a good report, but either way I will let you know the good, bad, and ugly.

Sunday, August 16, 2009

Books on SQL 2008

From what I have seen there are two basic types of tech learners: the book learners and the "learn as you go" crowd. I lean toward the learn as you go crowd in that I enjoy jumping in with both feet and playing around with a product, then reading about it later. The playing around gives me more context for the books.

Anyway, I picked up a few books to start reading up on SQL 2008 to augment my playing around. At the office I have Microsoft SQL Server 2008 Management and Administration since initially management and administration will be my focus for work. I have barely started it, but the chapter overviews look promising.

At home I have two more books. I started reading Microsoft SQL Server 2008 Internals because from what I have read online, it is an excellent book for really understanding the guts of how SQL Server works. I'm only in Chapter 1, but my quick overview of the book looks promising.

The third book, which I have not started, is Microsoft SQL Server 2008 - Database Development, preparation for the 70-433 MCTS exam. I have never earned a Microsoft certification, but I think it is time I start down that road. Given my current title of Database Developer, this seemed like a good MCTS to start with. Since this one is the least important for ramping up on installation, setup, and maintenance of SQL 2008, it will probably be the last book I read.

I'll post my impressions of the books as I finish them. And if you have any good book recommendations, feel free to post them in the comments.

Wednesday, August 12, 2009

Ready for Windows 7

I know patience is a virtue, but it's a virtue that I have in short supply. After having tested Windows 7 in beta and after installing RC a dozen times for production, I am ready to make the switch to Windows 7.

Everyone has their own reasons for liking Windows 7, but mine are pretty simple.

1) It boots faster than Vista and XP. Not a big deal to everyone but makes a difference to me.
2) From what I can tell, it either manages memory much better or it does a better job of prioritizing tasks. The end result is better performance even with less memory than on Vista.

Combine that with the great ease of networking a Windows 7 PC and the huge increase in security over XP along with more stability than Vista, and we have a winner.

I just wish it had not taken so long for Microsoft to churn out another OS as good as the beloved XP. Fortunately the new Toshiba laptop I just picked up at Best Buy (and am typing on at this moment) has a free upgrade to Windows 7. The wait is almost over.

Monday, August 10, 2009

Business Objects report scheduling

Part of our enterprise includes a Business Objects data warehouse and a host of Business Objects reports. I have mostly dealt with SQL Reporting Services or MS Access reports so I was excited recently to be asked to reschedule all the Business Objects reports.

Here are a few things I found noteworthy.

In addition to normal interval based report scheduling (i.e. daily, weekly, monthly, etc.) you can schedule based on a custom calendar. You create a named calendar and actually select the specific days that you want to schedule. It allows a high level of flexibility if your requirements are irregular.

The output options are pretty standard (PDF, Excel, and a web format), but you can easily use variable placeholders in your naming to customize the file name, email subject/message, etc., which simplifies and standardizes your file naming and/or emails.

One problem with the scheduling, as with most out-of-the-box solutions, is that you cannot easily re-run a whole scheduled day of reports. You must run the reports individually if there are problems. This is also true of SSRS reports, which is why I wrote a database and SSIS packages to automate daily processing of scheduled reports. Maybe I'll blog about that another time, but it would take many posts to cover.

Overall, I am impressed with the Business Objects report scheduling ease of use and flexibility. Hopefully I'll get a shot at some report revision soon, and if I do, I'll let you know how it compares to SSRS report design.

Wednesday, August 5, 2009

Fixing a Reporting Services Report that belongs to someone else

It is a common occurrence in a shop with multiple report developers and no version control. Someone goes on vacation, you get a call that a report is giving an error, and then what? You don't have the local Visual Studio project or rdl file to edit. No SourceSafe to pull the file from. What do you do?

Log in to your Reporting Services host at http://[hostname]/Reports to get a list of folders and reports. Navigate the directory structure to find the offending report. Once you find it, you need to copy it to the local PC.

If the "Show Details" button appears on the right-hand side of the sub-toolbar, click it. When it reads "Hide Details" you are ready to move on. Now click the icon in the "Edit" column (just after the check box). Under the "Report Definition" section you will see an "Edit" link. When you click it, you will get a save dialog box and can save the report's rdl file to your PC. My recommendation is to save it to your desktop first.

Then open SQL Server Business Intelligence Development Studio (i.e., Visual Studio) and either open a project that you use for researching that person's reports or start a new project for researching other people's reports. In the Solution Explorer, right-click the Reports folder and select Add --> Existing Item...

Navigate to your desktop (or wherever you saved the rdl) and select the file you just downloaded. Now it is part of your project, and you can look at data sources, evaluate issues, and exercise your genius to solve the problem.

And if the solution does not lie in the data sources, you can always set up the target report folder, server, etc. to post changes to the actual rdl file. If you do this, I recommend holding onto the copy of the rdl still sitting on your desktop; it makes a good backup in case you make things worse and need to backtrack.

Thursday, July 30, 2009

Google Voice

I am seriously geeking out about Google's new Voice service. I signed up to be part of the early testing period and after about an hour, I am hooked and amazed.

First of all, notice at the top of the right-hand column on my Blog homepage the new "Call Me" link with a phone icon. You can type in your name and phone number, Google Voice will call your phone, and when you answer you will be connected to my voicemail (or my phone if I had it set that way). You don't even have to dial the number!

I was able to choose a phone number in my area code and search for a number containing a word or series of letters I wanted to appear in it. The Google Voice page checks whether a number with that word is available, and if so, it appears on the screen. I tried "hugh" and "jhthomas", but finally found that "thom" was available, so now I can use the phone number 205-677-THOMAS, which is really just THOM but looks better with my whole last name. Pretty cool.
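The vanity-number trick is just the standard telephone keypad letter mapping, which a few lines can demonstrate:

```python
# Standard phone keypad letter groups, used to translate a vanity
# word like THOM into the digits you actually dial.
KEYPAD = {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}
LETTER_TO_DIGIT = {letter: digit
                   for digit, letters in KEYPAD.items()
                   for letter in letters}

def vanity_to_digits(number):
    """Convert letters in a vanity number to keypad digits."""
    return "".join(LETTER_TO_DIGIT.get(ch.upper(), ch) for ch in number)
```

So 205-677-THOM dials as 205-677-8466; the extra AS on the end is just for looks, since only the first seven digits matter.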

You get voice mail which can convert your received messages into text and email the text of the messages to the email address of your choosing. I tested this and it works surprisingly well. You can also make outgoing calls to anywhere in the continental US for free. That's right, FREE! You just initiate the call from the website, it calls your phone, and when you pick up it connects you to the other party with no long distance charges.

I can call my Google Voice number from any phone to check messages, call another number, use GOOG-411, or change my settings.

I would recommend getting a Google Voice account as soon as you can, even if you just use it the way I plan to: for connecting with my blog readers.

Wednesday, July 29, 2009


My name is Hugh Thomas and I finally decided to cut my teeth on my first professional blog about SQL Server. I have been developing databases for about fifteen years going back to Access version 2.0. In 2000 I started developing with SQL 2000 and I've never looked back.

As a database engine with a feature-rich tool set, I consider the SQL Server platform the best bang for your buck, and the easiest platform on which to build multiple tiers of support, from the very basic support user up to the super DBA.

I have been using SQL 2005 in production since three months after it was released and am currently testing SQL 2008 for future development. Although my primary role is that of a database developer, I have played the role of our primary (well, really our only) DBA since 2000. That means I have installed SQL 2000-2008 multiple times, dealt with the configuration side of the server and services, and banged my head against the wall when things went south.

I plan to blog about what I learn, mistakes I make, and link to interesting resources that I find along the way. If my successes and failures can help someone else, all the better.

Until next time,