Category Archives: Tools

Norton Ghost

I finally got Norton Ghost and, after hours of googling, gave up on the idea of backing up my 160GB slave drive.

I hate using my blog for tech support, but with Ghost 9.0, is there a way to back up a slave drive so that I could extract individual files if needed? You can do that with the backup image of the primary drive.

My slave drive is partitioned into three drives. Ghost did seem to find the first partition of the slave drive, labeled it “unknown drive”, and let me back it up, but it did not allow me to open the backup image and view its contents. I don’t envision myself restoring a whole drive as much as I’d want to restore a file here or there.

So I’ll be leaving for my long drive a little late since copy & paste takes a while with 30 GB.



Posted from BLInk!

Temp Tables in SQL 2000

I had a hard time remembering how to do this and was not using the right keywords in Google.

There are a few ways to do temp tables in SQL. In SQL 7 we could use a #tablename temp table, but there are issues with that: the work that populates it leaks into the returned resultset, and you have to remember to drop it.

In SQL 2000 you can still use that syntax, but in many cases a table variable is a much better route. It is a variable declared in your stored procedure; it is local, so you don’t have to worry about deleting it, and it does not affect the result set.

NOTE: You must include “SET NOCOUNT ON” at the beginning of your stored procedure to prevent that extra gunk in the result set that ADO retrieves. I found that this works with table variables, but it does not fix the problem with #temptables.
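The SET NOCOUNT ON plus table-variable combination can be sketched like this (the procedure, table, and column names are all invented for illustration):

```sql
CREATE PROCEDURE GetRecentOrders
AS
    SET NOCOUNT ON  -- keep the "rows affected" gunk out of what ADO sees

    -- Table variable: scoped to this procedure, cleaned up automatically,
    -- no DROP TABLE to remember.
    DECLARE @Recent TABLE (
        OrderID   int,
        OrderDate datetime
    )

    INSERT INTO @Recent (OrderID, OrderDate)
    SELECT OrderID, OrderDate
    FROM   Orders
    WHERE  OrderDate >= DATEADD(day, -7, GETDATE())

    -- Only this SELECT shows up in the result set ADO gets back.
    SELECT OrderID, OrderDate
    FROM   @Recent
GO
```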

Here’s a useful article on it: http://www.sqlteam.com/item.asp?ItemID=9454 with syntax and caveats.

I had a sproc using the #temptable method that worked just fine here on my test server, which is SQL 2000. When I deployed it to my client’s production server, also SQL 2000 with the latest service packs, ADO (in VB6, not ADO.NET) had an issue with recordset.next against the production database that did not show up when I ran against the local database. So I went with the table variable instead.



Posted from BLInk!

Designing Crystal reports – from a database vs. from ADO.NET

One of the advances in Crystal version 8 was the ability to create TTX files from our recordsets and then design reports from those TTX files. This alleviated the huge headache of being forced either to use that mistake of a database environment or to hook directly to databases to create reports. Building those dependencies between the report and one of those resources meant confusing memory problems when you ran the report. The TTX file was just a little text file that contained the schema of your recordset. Once the report was designed, you didn’t even need to tote the TTX file around anymore.

In .NET, this evolved into the ability to design the report from an ADO.NET dataset. You do this by creating a dataset object in your project and linking to it when you design the report. I generally create the object by calling dataset.WriteXmlSchema in my code initially. Once that file is created, I comment out the line, bring the schema into my project, and generate a dataset object from it. (Did you know you can do that in VS2003 by right-clicking on the XSD and selecting the mysterious “Run Custom Tool”?)
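Here is a rough sketch of that workflow (the dataset, table name, file path, and report variable are made up; GetOrdersDataSet stands in for however you normally fill the data):

```csharp
// One-time step: dump the schema of the dataset the report will be
// designed against. After the first run, comment this out, add Orders.xsd
// to the project, and use "Run Custom Tool" to generate a typed dataset.
DataSet ordersData = GetOrdersDataSet();
ordersData.WriteXmlSchema(@"C:\temp\Orders.xsd");

// At runtime, hand the report a DataTable (not a DataView or the whole
// DataSet) to help avoid the "please logon" prompt.
myReport.SetDataSource(ordersData.Tables["Orders"]);
```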

Many people are still unaware of this feature and do what comes naturally, which is to point directly to the database when designing a report. This is one of the causes of the dreadful “please logon” message we get at runtime when trying to view a report. (Another cause, I discovered, is passing a DataView to the report’s data source instead of a DataTable, and apparently passing a DataSet can do this too.)

Here is a PDF from the Crystal Reports .NET DevCenter on the Business Objects site that explains how to build reports from ADO.NET.

Also, here is an article by Susan Harkins that I found for someone that explains how to create an XSD file from Access.



Posted from BLInk!

RequestCachePolicy in .NET 2.0

Two of the new classes inside System.Net.Cache are HttpRequestCachePolicy and RequestCachePolicy. They give us some control over client-side caching on a request-by-request basis. Since I think “web apps” when I think about caching, it took me some time to grok (with some help from a few folks at MS) what their real purpose is.

Of course, it doesn’t make sense that a webserver-based app would have ultimate control over client-side caching (wouldn’t that be cool though – magic, but cool!). As far as I know, only the client machine has knowledge of what is in its cache.

These classes are available to client and middle-tier applications that make HttpWebRequests or FtpWebRequests, and they let you control how a web request is handled based either on the age of what is in the cache or simply on where the content should come from.

Since I do most of my interaction with a webserver via web services* or aspx pages, this doesn’t solve any problems I am currently having, but it has given me some ideas about how I might leverage it in my applications.

Here are some ideas.

Say you have a web application that creates a PDF report and returns it to the calling client – maybe a Windows app, or even another web app. The data for this report does not change very much during the course of the day, and it is a huge report, so you don’t really want users taxing the webserver by requesting it more than once per day. You can tell the calling application to check the local cache first: if the download is in there and it is less than one day old, it will not bother making the request.
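In code, that age check might look something like this (the report URL is hypothetical):

```csharp
using System;
using System.Net;
using System.Net.Cache;

// Accept a cached copy of the report if it is less than one day old;
// otherwise go to the server for a fresh one.
HttpWebRequest request =
    (HttpWebRequest)WebRequest.Create("http://example.com/reports/daily.pdf");
request.CachePolicy =
    new HttpRequestCachePolicy(HttpCacheAgeControl.MaxAge, TimeSpan.FromDays(1));

using (WebResponse response = request.GetResponse())
{
    Console.WriteLine("Served from cache: " + response.IsFromCache);
}
```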

Another example is for Windows apps that actually use the web browser control. I have never had occasion to do this myself, but you can control how calls are made to the webserver to populate that control with instructions similar to the above sample – based on age or freshness – or you can instruct it to always go to the local cache under certain situations, or always to the server. Or you can tell it that if there is nothing in the cache, then get it from the server, but otherwise always get it from the cache. My simple demo only showed this functionality in action: I had a web page that displayed the current time, and you could see, for example, the time not changing for 5 seconds even though I was doing rapid-fire refreshes on the page.
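The location-only variations are just the RequestCacheLevel values (the URL here is hypothetical):

```csharp
using System.Net;
using System.Net.Cache;

WebRequest req = WebRequest.Create("http://example.com/time.aspx");

// Always go to the server, ignoring (and not storing to) the cache:
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);

// Prefer the cache; only hit the server if nothing usable is cached:
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.CacheIfAvailable);

// Cache only: never touch the server (throws if the item isn't cached):
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.CacheOnly);
```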

This is not earth-shattering functionality for the work that I do, but I can imagine that people writing applications for large corporations would benefit greatly from every tool that reduces the toll on the webserver. You can read more in the online documentation for .NET 2.0: Cache Management for Network Applications.

*Re: web services – I REALLY wanted this to work with web services, but it just doesn’t. Although you can pass it the URI of an .asmx file, there is no way to call the web method. I tried setting up the definition of the policy against the entire web service and then making a call to the web method, but it just didn’t work.

Posted from BLInk!

Debugger Visualizers in latest bits

Although I was expecting some syntax changes, so far the debugger visualizer that I wrote with the March bits and ran with the Beta1 bits still runs in the October bits.

Two things to note. First, if you have the debugger visualizer project open in the same solution as the app you want to use it in, you will get a weird error when trying to use that visualizer at run time. I actually haven’t dug into how to debug these things; they are not like a project that you reference and debug into. If you are testing one, just open it in a separate instance of VS, then compile, copy the DLL into the directory, and test again.

The other thing: when you create your visualizer, you still (as per the Beta1 bits) have to put it into the My Documents\Visual Studio\Visualizers folder. In these bits, that folder does not pre-exist; however, if you use one of the canned visualizers, the folder gets created automatically.

Here is another pointer if you are moving a debugger visualizer from older bits to newer bits.



Posted from BLInk!

Just for fun

Just for fun, I’m going to put the latest CTP of Visual Studio on the laptop I will be using for my What’s New in BCL Whidbey for ASP.NET Developers presentation at next week’s ASPConnections. Maybe I’ll even try to run SQL 2000 Developer Edition and SQL 2005 Express side by side. Just have to watch out for that SQLDMO problem (though maybe it’s fixed now).

Posted from BLInk!