Recovering Hard Deleted Items in Outlook

This isn’t new information, but it’s something that comes up from time to time – recovering hard deleted (SHIFT+DEL) items from Outlook. Hard deleted items skip over the “Deleted Items” bin, so they can’t be recovered using the regular “recover deleted items” tool within the Outlook client.

Exchange 2003 OWA can be used to recover items that were hard deleted using the Outlook client. To get them back, log into the OWA web page, then edit the URL to be: “https://server_name/exchange/user_name/inbox/?cmd=showdeleted”. The “dumpster” for the inbox will appear and you can recover your deleted email. If you want to recover items from other folders, just change the word “inbox” in the URL to the folder you need, like “calendar” or “drafts”.

If you are using Outlook 2003 as your mail client, there is a registry setting you can add to turn the dumpster on for all folders; Outlook 2007 enables this behavior by default. Of course, recovering any deleted items assumes that the deleted item retention settings have been configured on your Exchange server.
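
For reference, the Outlook 2003 setting is the DumpsterAlwaysOn value. Here’s a quick .reg sketch – the key path and value name are the ones commonly documented by Microsoft, so double-check against the KB article before pushing it out:

Windows Registry Editor Version 5.00

; Enables “Recover Deleted Items” on every folder in Outlook 2003 (commonly documented setting; verify before deploying)
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Exchange\Client\Options]
"DumpsterAlwaysOn"=dword:00000001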

Installing IIS for SQL 2005 and SharePoint

I’ve started planning out an installation of SharePoint at work and have found myself installing some of the necessary WSS 3.0 components in the lab. I want to set up SharePoint as a small server farm on one server, which requires SQL Server to be installed first. Both SQL 2005 (if you want all the services) and WSS 3.0 require IIS, but the default installation of IIS on Windows Server 2008 does not include all the necessary components for either one.

First, to support WSS 3.0, you’ll need to make sure all the components in this list are selected. But if you go with just the components on that list, you’ll still get a warning about “IIS Feature Requirement” when installing SQL 2005. Most of the necessary components overlap with WSS 3.0 except for one – HTTP Redirection – so be sure to select that one as well.
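
If you’d rather add that from the command line on Server 2008, ServerManagerCmd can do it. The role service ID below is my best guess at the standard name, so confirm it with a query first:

REM List installed roles and role services to confirm the exact ID
ServerManagerCmd -query

REM Add the HTTP Redirection role service for IIS 7
ServerManagerCmd -install Web-Http-Redirect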

Finally, if you are looking around for some WSS 3.0 installation guides, here is a link to some of the downloadable documentation. Perfect if you are looking for some fresh reading on your Kindle.

Keeping track of the SQL User Provisioning Tool

Here’s a tool I find myself looking for over and over again. After installing SP2 or SP3 on an installation of SQL 2005, you have the option to run the “User Provisioning Tool for Vista”, allowing you to set the proper access permissions.

However, if you haven’t restarted the SQL services before running it, it fails to connect to the database and then closes. There isn’t a shortcut anywhere to launch it again, so it can be a mystery how to locate it. The path to the tool is:

%ProgramFiles%\Microsoft SQL Server\90\Shared\sqlprov.exe
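
What works for me is restarting the SQL services first and then launching the tool from that path. A rough example for a default instance (swap in your instance and service names if they differ):

REM Restart the default SQL Server instance so the tool can connect
net stop MSSQLSERVER
net start MSSQLSERVER

REM Launch the provisioning tool again
"%ProgramFiles%\Microsoft SQL Server\90\Shared\sqlprov.exe"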

For more information about why this tool exists, check out this MSDN blog post.

Who’s Geeky? She is.

Happened across the She’s Geeky conference while surfing around the web. “She’s Geeky” is an event specifically for women interested in and/or working in the technology, math and science industries. Actually, it’s an “un”conference – 3 days of geek-minded women gathered together with a daily agenda of tracks and sessions generated fresh every morning.

I’m always up for an interesting tech conference, plus it’s hard to pass up an event being held at the Computer History Museum in Mountain View, CA. Seems like a great chance to check out the Babbage Engine, too!

Handy ASP.NET 2.0 Tidbit

I’ve been familiarizing myself with WSS 3.0 this week and as part of that process I’ve been doing several installations in the lab. I ran into an issue on Windows Server 2003 with the installation of the .NET Framework 3.0 and ASP.NET 2.0, which are required for the installation of Windows SharePoint Services.

While I had all the components installed, ASP.NET 2.0 appeared to be missing from IIS. Our DBA has some experience with IIS and had run into a similar problem in the past, so he had the answer for me. ASP.NET 2.0 isn’t automatically registered with IIS, and that problem is easily solved by running this command:

c:\windows\microsoft.net\framework\v2.0.50727\aspnet_regiis -iru -enable
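
If you want to double-check that the registration took, aspnet_regiis can also list the ASP.NET versions it knows about – just a sanity check I run from the same framework folder:

REM List the ASP.NET versions installed on the machine, with status and paths
c:\windows\microsoft.net\framework\v2.0.50727\aspnet_regiis -lv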

I want to keep this fix handy, since my co-worker certainly saved me some time. Figured I might as well pay it forward and share it with others who may run into the same issue.

Exchange Server under the tree this Christmas?

I’ve been reading a lot about Exchange 2007 and have been thinking about what the next move for our Exchange server at the office should be. We haven’t decided on Exchange 2007 vs. Exchange 2010 yet, but no matter… I want Santa to bring me a way to eliminate all the PST files being used around the office.

We don’t have a large staff. With fewer than 70 people, our Exchange server doesn’t work that hard. However, with the desire to bring email services back up as quickly as possible after a failure, we have a policy in place that limits the amount of mail stored on the server to 250MB per user. This leaves our data store at a little over 18GB. Our last test restoration of Exchange required about 2 hours to load the database.

Contrary to this is everyone’s need to keep every scrap of every email message. This has led to numerous PST files created as archives for all this mail. It’s pretty safe for me to assume that almost every employee has at least one PST file, and they are all stored on the network shares. (Yes, I know PST storage on the network is unsupported.) My quick search yielded about 30 GB of PST files and I know I didn’t find them all.

So what exactly can Santa bring me?

First, I would be lying if I said I needed a server with more space. The current Exchange server still has upwards of 180GB free, so it’s likely I could support years of user email with our current setup just by throwing open the storage limits.

I would like to have a proper email archiving system that would automatically move mail from the active mailboxes to secondary storage, thus leaving my primary database small while allowing users to seamlessly access old messages. Personally, I don’t keep much in the way of work email and I think that if my company wants me to keep mail for historical purposes, they should provide an easy way to do so. However, I haven’t managed to convince the powers-that-be that this is something to embrace quite yet.

My next choice would be reconfiguring Exchange using 2007 or 2010 to take advantage of additional storage groups and “dial-tone” mail service. If I could virtualize the mail server with a SAN for storage, I could bring basic services up in a snap(shot). By breaking up users into multiple storage groups, it would be possible for us to restore mail service immediately and then backfill the databases in small chunks. While it would still take time to restore all the data, users would be able to send and receive mail while old mail would trickle in as the storage groups come back online.

I know “dial-tone” restores are possible with my current setup, but they are much easier in Exchange 2007 or later than in Exchange 2003, thanks to the auto-discovery features. I would also like to have at least one storage group (with only one database) per department, nearly double the four-storage-group limit of Exchange 2003. With the 50-storage-group limit in Exchange 2007, I wouldn’t have any problem meeting that goal. Also, Exchange 2010 has some good “starter” archiving features for mail management that might be worth a closer look.
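
Just to sketch out what a per-department storage group would look like from the Exchange 2007 Management Shell – the server name, paths, and department name below are made up for illustration:

# Create a storage group and a single database for one department, then mount it
New-StorageGroup -Server "MAIL01" -Name "SG-Accounting" -LogFolderPath "D:\Logs\SG-Accounting" -SystemFolderPath "D:\Logs\SG-Accounting"
New-MailboxDatabase -StorageGroup "MAIL01\SG-Accounting" -Name "DB-Accounting" -EdbFilePath "E:\Databases\SG-Accounting\DB-Accounting.edb"
Mount-Database -Identity "MAIL01\SG-Accounting\DB-Accounting"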

Of course Exchange 2007 and 2010 require 64-bit hardware, so maybe Santa can bring me that new server after all.

IT Roadmap at Moscone Center

Yesterday was the Network World IT Roadmap in San Francisco. I had the experience of being the user case study presenter for the virtualization session. If you happened to catch it, I apologize for talking too fast. I’m working on that!

Other sessions covered application delivery, green IT, IP communications, data center, cloud, network management, security and compliance and WAN, LAN and mobility. Phew. Network World offered a lot in one day, plus several additional keynotes and the expo hall. My co-worker caught the WAN, LAN and mobility session, so I’m curious to see what trouble he’ll be looking to cause in the office next week.

There was some twittering happening related to the conference, but I was disappointed to see that the @itroadmap Twitter handle didn’t tweet at all during the event. They had advertised Twitter on the conference site as a way to stay connected during the conference yet didn’t reach out to that audience once. Twitter is becoming a popular way to interact as things happen – several attendees were tweeting during sessions – so it seems like Network World missed out on an opportunity there.

If You Build It, Can They Come?

I’ve posted several times about working on a disaster recovery project at the office using Server 2008 Terminal Services. We’ve officially completed the testing and had some regular staffers log on and check things out. That was probably one of the most interesting parts.

One issue with end user access was the Terminal Services ActiveX control on Windows XP SP3, which is disabled by default as part of a security update in SP3. This can usually be fixed with a registry change, which I posted about before; however, that requires local administrative privileges, which not all of our test users had. There are also ActiveX version issues if the client machine is running an XP service pack earlier than SP3.

Administrative privileges also caused some hiccups with one of our published web apps that required a Java plug-in. At one point, the web page required a Java update that could only be installed by a server administrator, which caused logon errors for all users until it was addressed.

In this lab setting, we had also restored our file server to a different OS. Our production file server is Windows 2000 and in the lab we used Windows 2008. This resulted in some access permission issues for some shared and “home” directories. We didn’t spend any time troubleshooting the problem this time around, but when we do look to upgrade that server or repeat this disaster recovery test we know to look into the permissions more closely.

Users also experienced trouble getting Outlook 2007 to run properly. I did not have issues when I tested it myself – there were some dialog boxes that needed to be addressed before it ran for the first time, to confirm the username and such. While the answers to those boxes seem second nature to those of us in IT, we realized that we will need to provide better documentation to ensure that users get email working right the first time.

In the end, detailed documentation proved to be the most important aspect of rolling this test environment out to end users. In the event of a disaster, it’s likely that our primary way of sharing initial access information would be by posting instructions to the Internet. Providing easy-to-follow instructions with step-by-step screenshots that users can work through independently is critical. After a disaster, I don’t expect my department will have a lot of time for individual hand-holding for each user who needs remote access.

Not only did this project provide an opportunity to update the procedures we use to restore services, it showed that it’s equally important to make sure end users have instructions so they can independently access those services once they are available.

On my calendar – SharePoint and Virtualization

I’ve got a couple of interesting things coming up this week.

First, I’m taking a quick two-day LearnIT! class on Windows SharePoint Services 3.0. There has been a desire at the office to add some collaboration tools, specifically for meeting management, and I’m hoping this short class will get me pointed in the right direction.

Later this week, I’ll be one of the case study speakers on virtualization at Network World’s IT Roadmap 2009. If you are there, be sure to drop me a tweet @jkc137. The conference Twitter handle is supposed to be @itroadmap, but that account seems to be a bit spam-filled at the moment. I hope they resolve that before Thursday.