In all of my years as a system administrator I have never seen a machine as infected as the one I saw today at my sister-in-law's house. When we turned on the machine, the first sign of trouble was that it did not go to the normal home page. When I tried to do a search, the links on the results page did not work. Naturally my sister-in-law did not know what had happened. Since the McAfee software had expired and my sister-in-law was fussing about the cost of antivirus software, I downloaded a current version of Microsoft Security Essentials. I had to reboot the machine into safe mode with networking support before I could download the current virus and spyware definitions. Microsoft Security Essentials found a multitude of Trojans and worms. I cleaned the computer and rebooted several times. Finally I decided to perform a full scan before clicking on the "Clean" button. The full scan took a long time, but the computer is now working as expected.
Adding IP Restrictions to IIS 6
A big thanks goes out to obligatorymoniker and his script for programmatically adding IP restrictions to IIS 6. I had been looking for a better script to add IP restrictions; my previous script added the restrictions one IP range at a time. That was adequate for a small number of restrictions, but recently I was asked to add IP restrictions for every country we do not ship to, after a fraudulent credit card transaction from one of those countries made the boss mad. Even after using Perl and a CIDR module to merge adjacent networks, I still had over 18,000 IP ranges to deny. When I tried to load these ranges into our test system with my old script, it took over an hour. His script loads them in a couple of seconds. Here is how I did it:
- I went to http://www.countryipblocks.net/ to get the IP ranges I wanted to block. Beware: these ranges include bogon networks (e.g. 192.168.0.0). The first time I applied the ranges I locked myself out.
- I used the Perl script below to merge the networks.
- I used obligatorymoniker's IP Security.vbs script to load the ranges. You will have to change "IIS://localhost/smtpsvc/1" to the metabase path of the site you want to add the IP restrictions to.
use Net::CIDR::Lite;
use NetAddr::IP::Lite;

my $cidr = Net::CIDR::Lite->new;

# Read the disallowed IP ranges, skipping comment lines
open(IPDISALLOW, "ip_disallow.txt") || die "couldn't open the file!";
while ($record = <IPDISALLOW>) {
    chomp($record);
    next if $record eq '';
    if (substr($record, 0, 1) ne '#') {   # 'ne', not '!=', for a string comparison
        $cidr->add($record);
    }
}
close(IPDISALLOW);

# Print each merged range as address, mask, and the original CIDR notation
foreach ($cidr->list) {
    my $ip = NetAddr::IP::Lite->new($_);
    print $ip->addr, ",", $ip->mask, ",$_\n";
}
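For reference, the loading side relies on the IIsIPSecurity ADSI object. This is not obligatorymoniker's script itself, just a minimal sketch of the calls involved; the metabase path and the sample ranges are illustrative.

' Minimal sketch (not the original script): write the deny list in one property assignment
Dim site, ipSec
Set site = GetObject("IIS://localhost/smtpsvc/1")   ' change to your site's metabase path
Set ipSec = site.IPSecurity
ipSec.GrantByDefault = True                         ' allow anyone not explicitly denied
ipSec.IPDeny = Array("203.0.113.0,255.255.255.0", _
                     "198.51.100.0,255.255.255.0")  ' sample "network,mask" entries
site.IPSecurity = ipSec
site.SetInfo

Writing the whole IPDeny array in a single assignment, instead of one ADSI round trip per range, is presumably why the bulk load finishes in seconds.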
Adventures with iRedMail – Part III
Recently I installed iRedMail at work so that we could include DKIM signatures in our newsletters. Every week we send out a newsletter to 96,000 former customers, and it takes about 13 hours to send. Yahoo is probably our most important email domain, and they want us to implement DKIM. A couple of weeks ago we started seeing Yahoo limit our sending rate; obviously they had a problem with something in our newsletter. So we re-analyzed the error codes we were getting during the newsletter mailing and implemented DKIM. The problem is fixed. Here is how I implemented this version of iRedMail.
I implemented a VMware version of iRedMail to sign newsletter emails using DKIM. I used Ubuntu 9 Server (the VMware-optimized build) to build the appliance.
- The server works as a mail proxy in front of the SMTP server we use exclusively for the newsletter. It signs and relays the email to the existing SMTP server. I kept the existing SMTP server so that I could continue to use my existing procedures for parsing the log files to identify old/obsolete mailboxes.
- I created iRedMail users in LDAP to relay local users to mailboxes on Exchange.
- My primary bottleneck is still the speed of mail transmission to the Internet, about 2 messages per second. I can create newsletter emails at about 8 per second.
- On an old ProLiant DL350 G4, iRedMail consumes about 40% of the dual-CPU computer for four hours.
Since I had experience installing iRedMail, the installation went quickly. The biggest bug I had to fix was the AWStats permissions problem on the mail.log file.
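iRedMail does its DKIM signing through amavisd-new. As a hedged sketch of the key setup (the domain, selector, and key path are illustrative, and the amavisd.conf location varies by install):

# Generate a signing key (path is illustrative)
amavisd genrsa /var/lib/dkim/newsletter.example.com.pem

# In amavisd.conf, enable signing and register the key:
#   $enable_dkim_signing = 1;
#   dkim_key('newsletter.example.com', 'dkim', '/var/lib/dkim/newsletter.example.com.pem');

# Print the TXT record to publish at dkim._domainkey.newsletter.example.com
amavisd showkeys

Once the TXT record is in DNS, you can confirm the signature by mailing a Yahoo or Gmail account and reading the Authentication-Results header.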
Parsing log files with Powershell
I found myself wanting to parse a log file to find out which domain was getting the most newsletters. There are a variety of ways you can do this. Typically I would use Excel, but there were more than 65K lines to import, so I had to use something different. For kicks I did it in PowerShell; here is how I did it in three lines. The log file is a tab-delimited file without a header line. The field we are going to count is called "RecipientDomain".
$header = "ServerFQDN","ServerDomain","IPAddress","MailTime","ClientDomain","RecipientDomain","Sender","Recipient","MessageID","Status","User","Size","ClientFQDN"
$a = Import-Csv smtp-201007060000.csv -Delimiter `t -Header $header | Where-Object { $_.Status -eq "RECV=OK" }
$a | Group-Object -Property RecipientDomain -NoElement | Sort-Object -Property @{Expression="Count";Descending=$true}, @{Expression="Name";Descending=$false} | Select-Object -First 20
10 Laws of Productivity :: Tips :: The 99 Percent
Here are some great tips on improving your productivity. I included the highlights below. Read the full article to get the full explanation of each tip.
10 Laws of Productivity :: Tips :: The 99 Percent
1. Break the seal of hesitation.
2. Start small.
3. Prototype, prototype, prototype.
4. Create simple objectives for projects, and revisit them regularly.
5. Work on your project a little bit each day.
6. Develop a routine.
7. Break big, long-term projects into smaller chunks or "phases."
8. Prune away superfluous meetings (and their attendees).
9. Practice saying "No."
10. Remember that rules – even productivity rules – are made to be broken.
Importing Self-signed CA Certificate into Windows 7
Yesterday I opted to create self-signed certificates for my local servers. Most of my local servers already had self-signed certificates with default names, so it looked like a simple task. I found this document, Creating Certificate Authorities and self-signed SSL certificates, and in a few minutes I had created a new Certificate Authority and replaced my existing server certificate. I checked the site with my web browser and it complained about needing the Certificate Authority certificate. So I copied the CA certificate to my PC and imported it into the Trusted Root Certification Authorities using IE8. Despite a message saying it succeeded, it did not really import the certificate. I restarted the browser and restarted the computer, but the certificate refused to show up. I finally opted to log in as the Administrator and install the certificate into the Trusted Root Certification Authorities of the computer account. I suspect that the key requirement is to import into the computer account. For those unfamiliar with the process: open a command window and run "mmc". Click File > Add/Remove Snap-in and add the Certificates snap-in. When you add the snap-in it will prompt you to select which account you want to manage; select the computer account on the local system and click to add the snap-in. Now navigate to Trusted Root Certification Authorities and import the CA certificate.
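If you prefer the command line, certutil can import straight into the computer account's store. A one-line sketch, assuming the CA certificate was saved as ca.crt and the prompt is running elevated:

certutil -addstore Root ca.crt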
SQL Server and Subversion
Two years ago I started supporting a Classic ASP application that used SQL 2000 for the database. One of the complaints of the owner was that the previous person supporting the application did not keep track of program changes. The development environment was undocumented. A test system existed, but it was in an unknown state relative to the production system; it looks like it was created from a restored backup. There was an old copy of SourceSafe and Visual Studio 2003, but I was pretty sure they were not being used since I could not find any commit logs. They had a source control system (SCS) but had not used it. Initially I tried to like SourceSafe and the Visual Studio 2003 environment, but in a Classic ASP environment it does not bring a lot to the table; the workflow is slow and not very intuitive compared to Notepad++ and TortoiseSVN. Since this is a one-man shop, I had some experience with Subversion, and we were not going to upgrade SourceSafe, I installed Subversion, Notepad++, and TortoiseSVN. Next I synchronized the code for the production and development systems and made my initial load into Subversion. This was an adequate solution for the ASP, XSL, and XML files.
The database documentation was nonexistent as well, so I used SQL Enterprise Manager to script all of the tables, stored procedures, functions, and views, and loaded these files into Subversion too. Within a very short period of time I adopted SQL Server Management Studio Express (SSMSE) as my testing environment for SQL changes. Here is where I ran into my first SCS conflict: SSMSE scripted the SQL objects slightly differently than SQL Enterprise Manager. I wanted to develop and test SQL changes using SSMSE, but I also wanted the ability to script all of the SQL objects using SQL Enterprise Manager at any time, and I found the table-creation scripts generated by SQL Enterprise Manager more trustworthy. My kludge solution was to manually apply the changes to the script files created by SQL Enterprise Manager every time I wanted to commit. Using WinMerge this was not difficult, but it was an extra step. I yearned for a more elegant solution.
On Friday I think I found it. It was not easy to find; I think I found it on the second or third Google page. It looks like I can make a significant upgrade to my development environment. The project resides on CodePlex.com, and here is its description.
DBSourceTools is a GUI utility to help developers bring SQL Server databases under source control. A powerful database scripter, code editor, sql generator, and database versioning tool. Compare Schemas, create diff scripts, edit T-SQL with ease. Better than Management Studio.
Although it lists its status as beta, I installed it and scripted my database without problems. Here are the features that attract me the most.
- You can script the entire database including the data. I have not checked out the data scripting yet.
- You can edit T-SQL in the same format used to script the entire database if you use DBSourceTools as your editor. This should make it much easier for me to keep the source control system up to date.
- If everything works as advertised this should be a relatively easy way to deploy new development systems. I have been promising to deploy a development system with updated table data for over a year. I really like the idea of deploying new systems as a way to verify the integrity of your source control system.
If I can compare schemas and create diff scripts, that's frosting on the cake. As a SQL development environment it looks very promising, and the documentation is remarkably good for a new project, too. Although it is a pain in the butt, I started renaming my SQL Subversion files (e.g. .PRC to .sql) on Friday. This will take a long time since I am renaming the files via the repo-browser. A client-side rename is not a true rename: it is recorded as a copy plus a delete, so the pre-rename history is only reachable through the copy source. The command-line equivalent of a repo-browser rename is a URL-to-URL move, as sketched below.
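For reference, a hedged sketch of that URL-to-URL move (the repository URL and file name are illustrative):

svn move -m "Rename stored procedure script to .sql" http://svnserver/svn/repo/db/GetOrders.PRC http://svnserver/svn/repo/db/GetOrders.sql

Because the move is committed directly in the repository, svn log on the new name still traces back through the copy to the old file's history.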
Add a partition to Openfiler
I keep suffering from memory loss when it comes to using Openfiler. I use it so infrequently that I keep forgetting how to add a partition. The user interface is not very intuitive, so I keep having to rediscover the steps. I am posting this procedure as a reminder.
- Click on the link, Volumes (https://filer:446/admin/volumes.html), in the navigation menu at the top of the page.
- Click on the link, Block Devices (https://filer:446/admin/volumes_physical.html), in the navigation menu on the right side of the page.
- To add a partition to the device, /dev/sda, click on the link, /dev/sda, under Edit Disk column.
- At the bottom of the next page, Volumes : Block Devices : Edit Partitions, enter the data for the partition and click on the Create button.
Event ID 7024 on SBS 2003 computer
If you get "The Certificate Service terminated with service-specific error 2148204801 (0x800B0101)", you need to renew the certificate for the certificate authority of your domain; 0x800B0101 means the certificate is outside its validity period. If you are renewing a certificate for a self-signed domain CA, you can follow the procedure below. In my case the certificate is valid for 5 years.
- Go to Admin tools > Certification Authority.
- Highlight your server and right click. Then select All Tasks > Renew CA Certificate.
- If everything works, you should be able to start the certificate service. Highlight your server and right click. Then select All Tasks > Start Service.
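The renewal can also be scripted; a hedged sketch (run on the CA server from an elevated prompt, reusing the existing key pair):

certutil -renewCert ReuseKeys
net start certsvc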
w3wp.exe high cpu usage thread
Yesterday I learned an important lesson about IIS logs. They do not show you all of the requests hitting your server! Evidently the log does not show canceled requests.
Over the last couple of days I had been receiving complaints about slow responses from our web site, but by the time I looked at the CPU utilization it would be back within the normal range. I looked in the IIS log file for timeouts but could not find any. So I ran a two-hour trace on Thursday afternoon. Friday morning I crunched the trace with PAL and discovered several unaccountable CPU peaks attributable to w3wp.exe. An Internet search for "w3wp.exe high cpu usage" led me to this thread, in which several people recommended using IISPeek to find the misbehaving request. So I installed a trial version of IISPeek and started watching the transactions coming in. Pretty soon I saw something I was not expecting: a shopping site was coming to our site and trying to retrieve a product advertising feed. What was surprising was not that the shopping site was retrieving the feed, but that there was no record of it in the IIS log file. This request had been consuming our CPU for several minutes and then disappearing without a trace.

I knew this query had some serious performance issues, but the IIS log indicated that it had been working on previous days, and I did not know it was running so often. With IISPeek and Task Manager running together I could see the impact on the site. Evidently this particular shopping site would time out or cancel the query before our site either returned the data or timed out. It was at this moment that I figured out that IIS must have a "no harm, no foul" policy about canceled queries. My reliance on the IIS log was a mistake in this case. Since the shopping site was not getting the data, it would try again later; when I was watching with IISPeek I was seeing this request about every fifteen minutes. Fortunately I had already developed some web page caching code I could implement quickly to get us over this hump. Within an hour the shopping site had its data and our web site was back to normal. I have solved a lot of web site problems by looking at the IIS log, but in this case it was not the right tool for the job. On Monday I am buying a copy of IISPeek!
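The caching code itself is not shown here, but a minimal Classic ASP sketch of the approach (the names are illustrative, and BuildProductFeed stands in for the expensive feed query):

<%
' Serve the feed from Application state unless the cache is stale
Const CACHE_MINUTES = 60
Dim stale
stale = True
If Not IsEmpty(Application("feedTime")) Then
    stale = (DateDiff("n", Application("feedTime"), Now()) > CACHE_MINUTES)
End If
If stale Then
    Dim feed
    feed = BuildProductFeed()   ' hypothetical function wrapping the slow query
    Application.Lock
    Application("feedCache") = feed
    Application("feedTime") = Now()
    Application.Unlock
End If
Response.Write Application("feedCache")
%>

With something like this in place, the shopping site's fifteen-minute polling hits the cache instead of the database.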
Cleaning up an existing newsletter mailing list
In December 2008 I was asked to clean up some problems with our newsletter at work. We had over 100,000 people on the mailing list, and over 90% of the people on the list had ordered from our website. After a little bit of analysis I determined that we were bouncing 30% of our newsletter emails because we had failed to follow the most basic rules of newsletter management and the automated newsletter cleanup procedures did not work. Here is the list of tasks I used to clean up the newsletter and get the bounce rate down to 0.1%. In our case we are sending the emails out from a dedicated server at our office.
- Use a static IP to send out the newsletter. One of the first SPAM checks email providers use is to see whether the IP you are sending from resolves back to your domain (forward-confirmed reverse DNS). This means you need to set up a sub-domain (e.g. mailserver.mycompany.com) and a PTR record for the sub-domain.
- Set up a sub-domain for the static IP. Since our web server is hosted we had to ask our host provider to set up the sub-domain.
- Set up a PTR record for reverse DNS lookup. I asked the folks who provided us our static IP to set up the PTR record to the sub-domain.
- If everything is set up correctly you should be able to pass the reverse DNS lookup test. This is the site, Forward Confirmed Reverse DNS Lookup Test, I used to confirm it was working properly; you can also check it by hand with nslookup, as sketched after this list.
- Set up feedback loops if you can. Feedback loops are a pretty dumb idea that email providers like. I dislike them since only one of the feedback loops has been useful. I am grateful that AOL has made it reasonably easy for me to remove people who do not want to be on our mailing list, and I was able to quickly modify our existing newsletter template to embed an unsubscribe link that would make it through the feedback loop processing. On the other hand, I wasted a huge amount of time trying to set up feedback loops with Yahoo and Hotmail. Both Yahoo and Gmail want you to sign your emails with DKIM or they will not talk to you. DKIM was supposed to reduce SPAM, but I have not seen any reports showing that it does. Implementing DKIM will require me to set up a new email server for the newsletter, so it is pretty far down my priority list; so far I have been able to ignore this issue. Hotmail wanted me to get a letter from our local internet provider saying we were the only folks using the static IP. Our local internet provider, RoadRunner, told me several times that no customer had ever requested a letter like that and they were not going to provide it. I set up a feedback loop with Comcast, but after a couple of months they increased the amount of information they redacted from the email and broke the unsubscribe link in the feedback loop message.
- Embed an unsubscribe link in your newsletter template that will unsubscribe the user but does not require the email address. Since most feedback loops redact the email address, this will allow you to click on the link in the feedback loop message to unsubscribe the user.
- Go through the error log on your email server and look for the messages that indicate an email address is inactive or no longer used. Unfortunately there is a multitude of messages used to describe unknown users (5.1.1, 5.5.1, unknown user, alias not found). This is one area that begs for a standard; this is where the feedback loop should have been.
- Manually go through your newsletter inbox and look for:
- Earthlink, PeoplePC, Zonealert, and other verification replies
- Unknown user messages.
- Feedback loop messages
- Changed addresses and unsubscribe messages.
- Other replies.
- Mailbox full, Out of office replies
- Customer service requests. About once a week we get a reply to the newsletter that asks a question about a product.
- Miscellaneous SMTP problems(e.g. DNS and email forwarding problems)
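Here is the hedged by-hand version of the FCrDNS check mentioned above (the IP address and host name are illustrative); the two lookups should agree:

# Reverse lookup: the PTR for the sending IP should name the mail sub-domain
nslookup 203.0.113.25

# Forward lookup: the sub-domain should resolve back to the same IP
nslookup mailserver.mycompany.com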
How to remove the Windows.old folder that is generated when you install Windows 7
I am not sure how I got this 2.2 GB folder on my "C" partition, but it was not necessary. With free space on my "C" partition down to 5%, it was time to clean house. The Vista instructions will work as written if you run the Disk Cleanup utility as Administrator. If you happen to run the utility as a "mere mortal", the Windows 7 version has a "Clean up system files" button that restarts the utility as Administrator.
Jeditable and Classic ASP
This week I implemented a grid-style application using Classic ASP and Jeditable. The hardest part was trying to figure out what a save.asp version of save.php would look like. Here is the template I created.
<%
Dim sID, sValue, sType, sDataID, errorcode
Dim field1, field2, field3, field4

' The sID is a spreadsheet-style ID.
' For example, B3 would be the second editable field
' for DB ID field = 3.
sID = Request("id")
sValue = Request("value")
sType = Mid(sID, 1, 1)
sDataID = Mid(sID, 2)

' We have four editable fields.
' The changed field will not be null.
field1 = Null
field2 = Null
field3 = Null
field4 = Null
errorcode = 0

Select Case sType
    Case "A"
        field1 = sValue
    Case "B"
        field2 = sValue
    Case "C"
        field3 = sValue
    Case "D"
        field4 = sValue
    Case Else
        errorcode = 1
End Select

If errorcode = 0 Then
    ' Validate and update the database
End If

If errorcode = 0 Then
    ' Send back the value field
    Response.Write sValue
Else
    Response.Write "<b>!Error " & errorcode & "</b>"
End If
%>
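On the client side, wiring the editable cells to save.asp is nearly a one-liner. A hedged sketch (the CSS class and option strings are illustrative; by default Jeditable posts the element's id as "id" and the new text as "value", which is exactly what the template above reads):

$('.editable').editable('save.asp', {
    indicator : 'Saving...',    // shown while the POST is in flight
    tooltip   : 'Click to edit'
});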
How To Set Up A Terminal Server In Linux Using Ubuntu 9.10 And FreeNX
This article was timely. I had just installed a virtual version of Ubuntu on my ESXi server and set up VNC so I could access it. It was okay, but FreeNX is a more elegant solution. The combination of FreeNX and Firehol to set up the firewall makes it a winner in my book.
How To Set Up A Terminal Server In Linux Using Ubuntu 9.10 And FreeNX
FreeNX is an open source implementation of NoMachine's NX Server. It is a bit more akin to Microsoft's RDP protocol than the usual VNC, so while keeping bandwidth to a minimum, it maintains good visual quality and responsiveness.
Windows 7 Upgrade from Windows XP Home
I think I can finally say that I have finished the upgrade; Free Cell is installed. ;) This summer I installed the Windows 7 RC and was pleased with the performance and the look and feel on my three-year-old laptop. It would have been nice if I could have just upgraded the RC version, but I was going to the Professional version rather than the Ultimate version. Since I had previously partitioned my disk and cleaned up the disk space, I was in pretty good shape for a clean install. The hardest part was installing the device drivers for the old QMS printer and Epson scanner. Support for these devices was not included in Windows 7, so I had to install the old XP drivers using XP compatibility mode.
Notes on Installing the Network Monitoring Appliance
A couple of weeks ago I installed the Network Monitoring Appliance using the tutorial on HowToForge.com. Prior to installing it I was planning to give the latest community version of GroundWork Monitor, http://www.groundworkopensource.com/products/community-edition/index.html, another trial. My network monitoring objectives were to have the appliance notify me of problems on a remote web server and on my local network. Although these objectives can be accomplished with a ping or an "HTTP ping", I wanted to see some network throughput graphs, and I expected to need slightly more sophisticated database monitoring in the near future. Nagios was at the core of the best solution for me since it accomplished most of my needs and I was already familiar with it from a previous trial of GroundWork Monitor. The primary attraction of the Network Monitoring Appliance over GroundWork was its much smaller resource requirements; in my environment it would be sharing a VMware ESXi server. I was also pleased to see that the appliance used JeOS. For those unfamiliar with JeOS:
Ubuntu Server Edition JeOS (pronounced "Juice") is an efficient variant of our server operating system, configured specifically for virtual appliances.
Users deploying virtual appliances built on top of JeOS will benefit from:
- better performance on the same hardware compared to a full non-optimized OS
- smaller footprint of the virtual appliance on their valuable disk space
- fewer updates and therefore less maintenance than a full server installation
For my installation I decided to use VMware's 32-bit Ubuntu template to create the virtual machine. The only modification to the template was to adjust the disk drive size down from 8 GB to 1 GB. As described in the HowToForge tutorial, I installed the following programs.
- Ubuntu 8.04.3 JeOS as OS
- Nagios 2.11 for monitoring and alarming
- Smokeping 2.3 to observe latencies and packet loss
- MRTG 2.14.7 to observe network traffic’s tendencies
- RRDTool 1.2.19 as the Round-Robin Database for storing all measurement data
- Lighttpd 1.4.19 as a fast, lightweight web server frontend
- Weathermap4rrd for illustrating the network weather
- sSMTP as extremely lightweight MTA for mail delivery
The installation was quick; almost all of my challenges were in configuring the programs. Fortunately I had previous experience configuring the two most difficult programs, Nagios and MRTG. It helps if you have a basic knowledge of Perl since most of the programs use it. Here are my installation notes.
- One of the first things I needed to make this installation go more smoothly was an editor other than vim, so I could cut and paste from the tutorial into my SSH session. In my case I installed nano.
- The first application I configured was Smokeping. The configuration file is pretty easy to figure out and can be found at /etc/smokeping/config. If everything works you will see a nice graph of the ping statistics at http://yourip/cgi-bin/smokeping.cgi.
- Configuring Nagios is a bit more complicated. Since this is version 2 of Nagios, the configuration files are located at /etc/nagios2/conf.d; a minimal host definition appears after this list. The main Nagios web page is at http://yourip/nagios2/. The Nagios Quickstart document, http://nagios.sourceforge.net/docs/3_0/quickstart.html, is a good primer for folks not familiar with Nagios.
- The Debian logo did not appear in Nagios next to the localhost. It showed a missing image. After a little research I figured out that I needed to install nagios-images using apt-get install nagios-images.
- For some reason I did not seem to have cron installed and running. This is easily solved by apt-get install cron.
- MRTG is useful if you have a SNMP router to poll. I used my pfSense Firewall as the SNMP source. MRTG provides some nice graphs of network traffic and its page is located at http://yourip/cgi-bin/mrtg-rrd.cgi/
- Configuring Weathermap4rrd is a little challenging since the documentation is sparse, but it provides a clever network status graph once you figure out how to configure it. It uses the same data as MRTG to create its graph. The network status page for Weathermap4rrd is located at http://yourip/weathermap4rrd/weathermap.png.
- I installed apticron to nag me via email about installing security updates and Logwatch to find any problems posted in the log file by the installed programs.
- If you plan on getting emails from Nagios when a host is down, you should test it. Duh! The easiest way to test it is to deliberately mistype the host name. If you do not get the email, you should check your Nagios configuration, sSMTP configuration, and the SMTP log file.
- sSMTP is easy to configure and use. In the simplest configuration you point it at the SMTP server you are sending your emails to. If you are sending emails to more than one domain, you need to connect to an SMTP server that will relay emails for you.
- I installed PHP version 5 to see how hard it would be to install under Lighttpd. I followed the instructions on the Lighttpd wiki and PHP appears to be running without problems. Most of these network monitoring programs have newer versions in PHP. Some day in the future I plan to migrate to the PHP versions of Nagios and weathermap but it is not necessary for this small network.
- I created a simple navigational menu on the main page with links to the various network management status pages. It is much easier to use this menu than to remember the addresses of the different status pages.
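For reference, here is a minimal sketch of the kind of Nagios 2 definition that goes in /etc/nagios2/conf.d (the host name and address are illustrative; generic-host and generic-service are the stock Debian templates):

define host {
    use        generic-host
    host_name  remote-web
    alias      Remote web server
    address    203.0.113.10
}

define service {
    use                  generic-service
    host_name            remote-web
    service_description  PING
    check_command        check_ping!100.0,20%!500.0,60%
}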
Updated Script for emailing ntbackup log files
Jason left a comment on a previous post asking to see the script I am using to email NTBackup log files. Recently I converted the script from VBScript to PowerShell. Here is the PowerShell version.
#**************************************************
# Script Name: Ntbackup_E-Mail_Alert
# Version: 1.0
# Author: Bill Huber
# Last Updated: 19.Nov.2009
#
# Purpose: Concatenates two or more log files into the body of an email message. I schedule
#          this script to run at a time the backup job should be finished and to send me
#          the latest NTBackup log files as an email with a somewhat informative subject field.
#
# Legal: Public Domain. Modify and redistribute freely. No rights reserved.
#        SCRIPT PROVIDED "AS IS" WITHOUT WARRANTIES OR GUARANTEES OF ANY KIND.
#        USE AT YOUR OWN RISK. NO TECHNICAL SUPPORT PROVIDED.
#**************************************************

# Customize the following variables for your SMTP server, email from address,
# email address the message is going to, the minimum log size, and the log path.
$SmtpServer = "mySBServer"
$From = "mySBServer Administrator <administrator@myCompany.com>"
$To = "billhuber@myCompany.com"
$intLogSize = 1000   # If the log file is less than this size, the backup probably failed

# The following variable points to the log file location
$logpath = "C:\Documents and Settings\Administrator\Local Settings\Application Data\Microsoft\Windows NT\NTBackup\data\*.log"

# End of Customization

$SmtpClient = new-object system.net.mail.smtpClient
$SmtpClient.host = $SmtpServer

# Get the filenames and other stuff for the last two log files.
# We are going to concatenate the last two log files into the body of the email message.
$a = get-childitem $logpath | sort-object lastwritetime | select-object -last 2

# Get the last write time and size of the latest log for the subject line
$b = $a | sort-object lastwritetime -descending | select-object -first 1
$c = $b.LastWriteTime
$d = $b.length

$Title = "SUCCESS - Full Backup at $c"
if ($d -lt $intLogSize) {
    $Title = "ERROR - Full Backup Failed at $c"
}

$Body = ""
foreach ($line in Get-Content $a) {
    $Body += "$line `n"
}

$SmtpClient.Send($from, $to, $title, $Body)
Q-Dir – Multi-Pane File Manager :: the How-To Geek
I found that when updating our web site I would run a VB script to open three Explorer windows. I have chosen to replace the script with Q-Dir since it does a better job with screen real estate and allows me to open four panes. I use the portable version to avoid installation headaches.
Sometimes when looking through a file manager, it would be nice to have more than a dual-pane view. Now you can manage your files with up to four viewing panes at once with Q-Dir.
Note: Q-Dir is available in regular install and portable versions.
Dumping raw XML using ASP
When working on other people's code it is sometimes difficult to figure out where the data is coming from, and you really don't have the time to spend figuring it out. We were having a problem with an ASP page that was blowing up when a certain XML field was empty, so I wanted a simple command to dump the raw XML. Either the XML field had a different name or it wasn't in the XML file. I knew the command must exist, but it was surprisingly difficult to find. Here is what I used to dump an XML object called xmlData.
response.write xmlData.documentElement.xml
OBTW the element was not in the XML file.
Quick Takes: python(x,y) – Python for Scientists
Python(x,y) is a free scientific and engineering development software for numerical computations, data analysis and data visualization based on Python programming language, Qt graphical user interfaces (and development framework) and Eclipse integrated development environment.
Although I would say I am conversant in Python and can see why a lot of people like it, it is not necessary for any of my job functions. In fact I recently converted the only python program used at work over to PowerShell. It was a trivial program that has been written a million times in a multitude of scripting languages. In this case it had a bug so it was a fairly trivial exercise to convert it over to Microsoft’s favorite scripting language.
Scarcely could I imagine that I would be seriously playing with python just a couple of weeks later. The trigger for this event was a blog post on SQLServerCentral.com called Python for the SQL Server DBA. In the article I was intrigued when the author said he used Python(x,y). I had not heard of it so I checked out the web site, python(x,y) – Python for Scientists, and decided to convert an Excel spreadsheet graph over to python. The graph is a fairly standard multiple line plot of time data. This is the type of graph you can create in Excel in about five minutes.
It took a lot longer to create the graph in Python, but I am not disappointed. Much of my time was spent learning how to manipulate Matplotlib to achieve the desired graph. Matplotlib is a library for making 2D plots of arrays in Python and looks a lot like MATLAB(tm). Since my knowledge of MATLAB was nil, I had a lot of catching up to do. The flexibility of Matplotlib to customize a graph reminded me a lot of SAS/GRAPH. That is both the good news and the bad news. Although Excel has a lot of graphing options and I recommend it for most graphing requests, there is always some option it does not do quite right. Matplotlib overcomes those problems with lots of customization options and can be used to create some pretty exotic graphs. The bad news is that there is a significant learning curve in understanding how to use those options.
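A hedged sketch of the kind of multi-line time plot described above (the data is made up; only the Matplotlib calls matter):

import datetime
import matplotlib.pyplot as plt

# Illustrative data: two series over one week
dates = [datetime.date(2010, 1, d) for d in range(1, 8)]
series_a = [52, 61, 48, 70, 83, 64, 90]
series_b = [30, 42, 41, 55, 60, 71, 66]

plt.plot(dates, series_a, label='Series A')
plt.plot(dates, series_b, label='Series B')
plt.gcf().autofmt_xdate()   # slant the date labels so they do not overlap
plt.ylabel('Orders per day')
plt.legend()
plt.title('Weekly trend')
plt.show()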
Almost all of my development for this simple graph program was done in IPython, although more interactive environments like Eclipse and Spyder were available. In hindsight I would probably prefer Spyder for my next program. Most of my work is not very sophisticated, and the lightweight integrated IDE of Spyder appealed to me more than Eclipse; Eclipse is still relatively slow to start up. When I look at the whole python(x,y) download, its greatest contribution is the breadth of the products included. You can start from the command line for simple programs like I did and progress all the way up to a fairly comprehensive graphical user interface using Qt and Eclipse for sophisticated programs. The Python development environment has come a long way.