SQL 2008 Express installation errors with 0x84B10001

In my case I was trying to update SQL Server Management Studio (SSMS) from 2005 to 2008. Yes, I know it is 2012, but the 2005 tools were working. As a general rule I don’t mess with stuff that is working, but 2008 SSMS is a simple way to get 2008 SMO for PowerShell. The 2008 SSMS installation complained about the existing 2005 version and asked me to uninstall it before proceeding. After I uninstalled the 2005 version, I resumed the 2008 installation and it generated this error message.

SQL Server Setup has encountered the following error:

Unable to generate a temporary class (result=1).
error CS0006: Metadata file ‘C:\WINDOWS\assembly\GAC_MSIL\MSClusterLib\1.0.0.0__89845dcd8080cc91\MSClusterLib.dll’ could not be found

Error code 0x84B10001.

Although some folks recommended repairing the SQL installation, my fix was simple: I restarted the 2008 installation and it re-installed the missing items.

SQL Server and Subversion

Two years ago I started supporting a Classic ASP application that used SQL 2000 for the database. One of the owner’s complaints was that the previous person supporting the application did not keep track of program changes. The development environment was undocumented. A test system existed, but it was in an unknown state relative to the production system; it looks like it was created from a restored backup. There was an old copy of SourceSafe and Visual Studio 2003, but I was pretty sure they were not being used since I could not find any commit logs. They had a source control system (SCS) but they had not used it. Initially I tried to like SourceSafe and the Visual Studio 2003 environment, but in a Classic ASP environment it does not bring a lot to the table. The work flow is slow and not very intuitive compared to Notepad++ and TortoiseSVN. Since this is a one-man shop, I had some experience with Subversion, and we were not going to upgrade SourceSafe, I installed Subversion, Notepad++, and TortoiseSVN. Next I synchronized the code for the production and development systems and made my initial load into Subversion. This was an adequate solution for the ASP, XSL, and XML files.

The database documentation was nonexistent, too, so I used SQL Enterprise Manager to script all of the tables, stored procedures, functions, and views, and loaded these files into Subversion as well. Within a very short period of time I adopted SQL Server Management Studio Express (SSMSE) as my testing environment for SQL changes. Here is where I ran into my first SCS conflict. SSMSE scripted the SQL objects slightly differently than SQL Enterprise Manager. I wanted to develop and test SQL changes using SSMSE, but I also wanted the ability to script all of the SQL objects with SQL Enterprise Manager at any time, and I found the table-creation scripts from SQL Enterprise Manager more trustworthy. My kludge solution was to manually apply my changes to the script files created by SQL Enterprise Manager every time I wanted to commit. Using WinMerge this was not difficult, but it was an extra step. I yearned for a more elegant solution.
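To see the kind of mismatch I was merging by hand, here is a minimal Python sketch using difflib. The procedure text is made up for illustration, but the output is the same unified-diff view WinMerge presents interactively.

```python
import difflib

# Hypothetical fragments of the same stored procedure as scripted by
# SQL Enterprise Manager and by SSMSE -- names and text are made up.
em_script = [
    "CREATE PROCEDURE dbo.GetOrders\n",
    "AS\n",
    "SELECT * FROM Orders\n",
]
ssmse_script = [
    "CREATE PROCEDURE [dbo].[GetOrders]\n",
    "AS\n",
    "SELECT * FROM [dbo].[Orders]\n",
]

# Produce a unified diff, similar to what WinMerge shows side by side.
diff = list(difflib.unified_diff(em_script, ssmse_script,
                                 fromfile="EnterpriseManager.sql",
                                 tofile="SSMSE.sql"))
print("".join(diff))
```

Only the bracket-quoting style differs here, which is exactly the sort of cosmetic difference that makes two scripting tools fight over the same file in source control.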

On Friday I think I found it. It was not easy to find (I think it was on the second or third page of Google results), but it looks like I can make a significant upgrade to my development environment. The project resides on CodePlex.com and here is its description.

DBSourceTools is a GUI utility to help developers bring SQL Server databases under source control. A powerful database scripter, code editor, sql generator, and database versioning tool. Compare Schemas, create diff scripts, edit T-SQL with ease. Better than Management Studio.

Although it lists its status as beta, I installed it and scripted my database without problems. Here are the features that attract me the most.

  1. You can script the entire database including the data. I have not checked out the data scripting yet.
  2. You can edit T-SQL in the same format you used to script the entire database if you use DBSourceTools as your editor. This should make it much easier for me to keep the source control system up to date.
  3. If everything works as advertised this should be a relatively easy way to deploy new development systems. I have been promising to deploy a development system with updated table data for over a year. I really like the idea of deploying new systems as a way to verify the integrity of your source control system.

If I can compare schemas and create diff scripts, that’s frosting on the cake. As a SQL development environment it looks very promising, and the documentation is remarkably good for a new project, too. Although it is a pain in the butt, I started renaming my SQL Subversion files (e.g. .PRC to .sql) on Friday. This will take a long time since I am renaming the files via the repo-browser. The SVN client rename is not a true rename: it adds a new file to the repository and you lose the history.

Top free tools for Windows server administration

Every so often you find a tool you have never heard of. This week the tool that caught my attention was Performance Analysis of Logs (PAL). It was recommended by Bruce Mackenzie-Low in a newsletter from SearchWindowsServer.com, and it looks like it will be helpful with the “art” of performance analysis. I played with it a little bit using the IIS and SQL templates, and it seemed to provide some helpful insight into potential performance issues. My aim is to analyze our web server for IIS and database bottlenecks.

Re-engineering an application

Recently I found myself trying to debug an active server page application. It appears to be a simple application. When you go to the page, the server generates a text file which I call a data feed. It is used by search engines to build links to your products. The final step in the process is to download the data feed file and then upload it to the search engine site. This is such a simple application you could have programmed it in a variety of languages without much effort or concern. The original developer chose to develop the application as an active server page. ASP would not have been my first choice, primarily because programming it in SQL is a much simpler solution. In SQL the solution is so simple and straightforward it approaches the holy grail of computer programming: self-documenting code.

I got involved with re-engineering the ASP application because it was not working anymore. The page was not displaying and there were no error messages. By definition, applications are no longer simple when they fail without producing an easy-to-understand error message. I suspected the error might be related to a “response buffer limit exceeded” issue, so I increased the buffer limit. This worked on the development system but had no effect on the production system. That is not good! Now I was going down a path I did not want to go: fiddling with IIS parameters on a production system trying to fix a problem. Since I am definitely “old school” and evidently SQL-centric, I decided to turn this into a batch operation and skip the human download/upload process altogether. My plan was to schedule a SQL job to dump the data feeds into files using SQL and then use FTP to upload the feeds to their respective search engine sites.

I originally thought I would have this task finished in a day or two. Boy was I wrong! The combination of ASP, XML, XSL, and SQL stored procedures scattered the processing across various places and made it difficult to follow. Of course there wasn’t any program documentation, and the original programmer was unavailable. My plan was to combine everything into a SQL view that either BCP or OSQL could use to create a tab-delimited file. Using BCP I can use the ultra-simple “SELECT *” query on the view.
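For the curious, here is a rough sketch of how the scheduled job could assemble the BCP command line. The view name, output file, and server are placeholders, not the real ones from my system.

```python
import subprocess  # only needed if you actually run the command

def build_bcp_args(view, outfile, server):
    """Build the bcp command line for a character-mode, tab-delimited export."""
    return [
        "bcp", f"SELECT * FROM {view}", "queryout", outfile,
        "-c",         # character (text) mode
        "-t", "\t",   # tab as the field terminator
        "-S", server, # target server
        "-T",         # trusted (Windows) authentication
    ]

# Placeholder names; substitute your real view, file, and server.
args = build_bcp_args("dbo.vwProductFeed", "feed.txt", "PRODSQL")
# subprocess.run(args, check=True)  # uncomment on a machine with bcp installed
```

Keeping the query down to “SELECT *” means all the logic stays in the view, where it can be tested and versioned like any other SQL object.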

The first big problem was creating the category field. I needed to recursively look up the category parent from a table of categories. This process was originally performed in ASP. After some effort I created a SQL table to mimic the process.
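The recursive parent lookup itself is simple to sketch. My version lives in SQL, but here is an illustrative Python equivalent with a made-up category table.

```python
# Hypothetical category table: child id -> (name, parent id); None = top level.
categories = {
    1: ("Electronics", None),
    2: ("Audio", 1),
    3: ("Headphones", 2),
}

def category_path(cat_id, table, sep=" > "):
    """Walk the parent links up to the root and join the names, root first."""
    names = []
    while cat_id is not None:
        name, parent = table[cat_id]
        names.append(name)
        cat_id = parent
    return sep.join(reversed(names))

print(category_path(3, categories))  # Electronics > Audio > Headphones
```

Pre-computing these paths into a table, as I did, turns the per-row recursion into a plain join when the feed is generated.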

The next problems came in rapid succession. The description field needed the HTML tags removed, and some HTML entities needed to be escaped. Then I found that some products were listed in multiple categories, and the category being picked up by my view was a defunct one.
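The tag and entity cleanup is straightforward in any language; here is a hedged Python sketch of the idea (a crude regex tag stripper, not a full HTML parser).

```python
import html
import re

def clean_description(text):
    """Strip HTML tags, decode entities, then re-escape for the feed file."""
    no_tags = re.sub(r"<[^>]+>", "", text)    # crude tag stripper
    decoded = html.unescape(no_tags)          # &amp; -> &, etc.
    return html.escape(decoded, quote=False)  # re-escape &, <, > for the feed

print(clean_description("<b>Red &amp; blue</b> widget"))  # Red &amp; blue widget
```

Decoding first and then re-escaping keeps already-escaped text from being double-escaped (&amp;amp; instead of &amp;).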

One of the nice benefits of the “SQL view” approach was that it was easy to test and verify, and I had a backup plan if the batch process failed for some reason. Although I briefly tried OSQL, I found that BCP had a more direct way of creating tab-delimited files. Since it only takes a minute and a half to create the four feeds, processing requirements are not an issue. Once I had copied the headers to the front of the file, I was good to go. I verified the data using WinMerge on the development system, since the ASP page still worked there.
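Copying the headers to the front of the BCP output (BCP itself does not emit a header row) can be scripted too. A small Python sketch with throwaway file names:

```python
import os
import tempfile

def prepend_header(header_path, feed_path, out_path):
    """Write the saved header line(s) followed by the bcp output file."""
    with open(out_path, "w", encoding="utf-8") as out:
        for path in (header_path, feed_path):
            with open(path, encoding="utf-8") as src:
                out.write(src.read())

# Demo with throwaway files; the real feed names are placeholders.
tmp = tempfile.mkdtemp()
hdr = os.path.join(tmp, "header.txt")
feed = os.path.join(tmp, "feed.txt")
out = os.path.join(tmp, "final_feed.txt")
with open(hdr, "w", encoding="utf-8") as f:
    f.write("id\ttitle\tprice\n")
with open(feed, "w", encoding="utf-8") as f:
    f.write("1\tWidget\t9.99\n")

prepend_header(hdr, feed, out)
```

Keeping the header in its own small file means the feed layout can change without touching the batch script.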

The data matched and now I am ready to submit the files. This minor re-engineering took a lot more time than I planned, but I think the process is very easy to explain.

The next problems were more annoying. There were permission problems running BCP, and Yahoo created FTP problems for me. They allow you to update files using FTP, but your FTP client had better support PASV. I was able to upload the file using FileZilla but not Microsoft’s command line FTP client. I am searching for a command line FTP client I can use; I think MOVEit Freely from Ipswitch might be the answer. Ipswitch is probably best known for WS_FTP. A few years back WS_FTP was the standard bearer for FTP clients and servers.
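If a scripted upload ends up being the answer, Python’s ftplib is another option worth noting, since it uses passive (PASV) mode by default. The host, credentials, and file names below are placeholders.

```python
import ftplib

def upload_feed(host, user, password, local_path, remote_name):
    """Upload a feed file over FTP. ftplib defaults to passive (PASV) mode."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.set_pasv(True)  # explicit for clarity; True is already the default
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)

# Placeholder call; substitute real credentials and paths.
# upload_feed("ftp.example.com", "feeduser", "secret",
#             "final_feed.txt", "feed.txt")
```

Because the transfer is just a function call, it can be chained after the BCP step in the same scheduled job.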

Finally, I am not sure what happened to MSN’s product upload page, http://productupload.live.com. Suffice it to say it has had major problems every time I tried to use it. At this time I am not sure MSN wants me to update the data feeds using FTP. It is too bad they are so difficult to use. Most of our traffic comes from Google and Yahoo, so not surprisingly they get the bulk of our advertising expenses. MSN has always been a distant third.