Windows 7 – It’s Official

In case you’ve managed to miss it, today is the official launch of Windows 7.

Some great websites to check out if you haven’t already are Talking About Windows and the Microsoft Springboard for Windows Clients. Also, check out the Windows Team Blog post from yesterday listing out some additional events and purchase offers.

Finally, if you missed out on one of the live “The New Efficiency” events, there is a free virtual event on October 27th, produced by Windows IT Pro magazine.

Enjoy!

Takeaways from "The New Efficiency" Tech Series

Yesterday, I attended Microsoft’s “The New Efficiency” technical series, part of the Windows 7/Server 2008 R2/Exchange 2010 product launch. I was a little disappointed at the turnout, since registration had been closed so early. I expected more people and generally “more” from Microsoft with all these new products coming out in just days. But I guess not every event can be hit out of the park.

That being said, there were several sponsor-led sessions that were interesting and then tracks for Windows 7, Server 2008 R2 and Exchange 2010. My original plan was to hit something from every track, but that proved difficult as the presenters from each track didn’t always keep to the scheduled break times. Thus I stuck with the server track, which was presented by Chris Henley.

Here are a few of the features that were touched on during the sessions:

  • The integrated Best Practices Analyzer covers more areas, such as Active Directory Domain Services and DNS. The BPA was mostly known for its use with Exchange, so it’s nice to see it expanded to other critical areas.
  • The Recycle Bin for AD. This feature makes it easier to restore deleted objects in Active Directory without having to resort to an authoritative restore, effectively extending your recoverability of objects to nearly a year. While possible, it’s not recommended to reduce the deleted object and tombstone object lifetimes below the default 180 days each. Also, it’s important to note that the recycle bin feature is a schema change and can’t be turned off once implemented. Finally, items in the recycle bin can’t have their UPN reused until they move on to the tombstoned state, but you can manually force items to be moved earlier.
  • In Server 2008 R2 there were changes in the core architecture affecting the networking stack, so that IPv4 and IPv6 are both supported natively by the same Windows core protocols.
  • The Server Core installation option now includes WoW64 as an additional option, and IIS 7.5 on Server Core supports ASP.NET. Server Core has also gained a text menu environment called “sconfig” to make it easier to configure basic server settings.
  • New features in Remote Desktop Services, such as virtual desktops via Hyper-V, improvements in RemoteApp, multimedia support and bi-directional audio.
  • DirectAccess as an alternative to VPNs for corporate network access. DirectAccess requires at least 4 servers and includes a setup wizard that details how it all hooks together.
  • Improvements in Hyper-V, such as Live Migration and the ability to add some “hardware” (like hard drives) to virtual machines without powering them off. Don’t forget the Microsoft Assessment & Planning Toolkit, which can help minimize capital costs and reduce operating costs in your data center.
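The “nearly a year” of recoverability mentioned above comes from simple arithmetic: with the default deleted object and tombstone lifetimes of 180 days each, an object is fully restorable from the recycle bin for 180 days, then lingers as a tombstone for another 180. A quick Python sketch of that math (the lifetimes are the defaults, not values read from a live directory):

```python
from datetime import date, timedelta

# Default Active Directory lifetimes, in days. Both are configurable,
# but reducing them below 180 is not recommended.
DELETED_OBJECT_LIFETIME = 180   # fully restorable via the AD Recycle Bin
TOMBSTONE_LIFETIME = 180        # then lingers as a stripped-down tombstone

def recovery_windows(deleted_on):
    """Return (end of recycle-bin window, end of tombstone window)."""
    recycle_end = deleted_on + timedelta(days=DELETED_OBJECT_LIFETIME)
    tombstone_end = recycle_end + timedelta(days=TOMBSTONE_LIFETIME)
    return recycle_end, tombstone_end

recycle_end, tombstone_end = recovery_windows(date(2009, 10, 22))
print(recycle_end)    # → 2010-04-20, last day the object is fully restorable
print(tombstone_end)  # → 2010-10-17, after this the object is gone for good
```

That’s 360 days end to end, which is where “nearly a year” comes from.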

At the end of the day, the software giveaway was a copy of Windows 7 (32-bit) and the swag bag had the ever-popular XL t-shirt. Hidden among the product pamphlets in the bag was a cool gift from NetApp – a free copy of the book “Windows Server 2008 Hyper-V: Insider’s Guide to Microsoft’s Hypervisor”. Request your copy by November 20th. I’m sure the request will get you on a mailing list of some kind, but I’ll live with that for a free book.

Because It’s Already Here

A colleague of mine asked a valid question about my last post regarding how my office IT department uses ImageRight for document management instead of something else, like a Wiki. Of course a Wiki would work just fine. So would SharePoint or any other software that helps manage documentation and allows for collaboration.

I’m not saying that ImageRight is the end-all, be-all for document management. It’s just that ImageRight is what we have. One of the big topics that came up at the Vertafore Connections conference I attended a few months ago was that many companies using the product only deploy it to one or two departments to perform very specific business functions. I’ve found that it can be used by many other business areas if one just takes the time to carve out a place for their specific documentation and processes.

There is that old “law of the instrument” that can make a familiar tool look like a panacea for all problems, but I’m not trying to make an unsuitable piece of software meet our needs. We are simply using a product that our company has already invested in, instead of looking outside our existing infrastructure for a new solution. Not only does this save licensing, installation and maintenance costs for an additional product, it encourages members of our department to use ImageRight regularly, making us better able to support the other staff members in the office. We are not only supporting the backend of the program, but interacting with it as an end-user as well – a win-win for everyone.

Document Imaging Helps Organize IT

Since our implementation of ImageRight, our Network Operations team has embraced it as a way to organize our server and application documentation in a manner that makes it accessible to everyone in our team. Any support tickets, change control documents, white papers and configuration information stored in ImageRight are available to anyone in our group for reference.

This reduces version control issues and ensures that a common naming (or “filing”) structure is used across the board, making information easier to find. (For reference, an ImageRight “file” is a collection of documents organized together like a physical file that hangs in a file cabinet.) Plus, the ability to export individual documents or whole ImageRight “files” to a CD with an included viewer application is a great feature that I’m using as part of our Disaster Recovery preparations.

I have a single file that encompasses the contents of our network “runbook”. This file contains server lists and configuration details, IP and DNS information, network maps, application and service dependencies, storage share locations/sizes, support contact information, etc. It consists of text documents, spreadsheets, PDF files and other types of data. I keep a hard copy printed at my desk so I can jot notes when changes are needed, but ImageRight ensures I have an electronic backup that I can edit on a regular basis. Plus, I regularly export an updated copy to a CD that I add to the off-site Disaster Recovery box.
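For anyone assembling a similar runbook export without ImageRight’s built-in CD export, the step above amounts to copying the current documents into a dated folder ready to burn to disc. A minimal Python sketch (the folder names and layout here are hypothetical, not ImageRight’s actual export format):

```python
import shutil
from datetime import date
from pathlib import Path

def export_runbook(runbook_dir, export_root):
    """Copy the runbook contents into a dated folder under export_root,
    ready to burn to CD for the off-site Disaster Recovery box."""
    target = Path(export_root) / f"runbook-{date.today():%Y-%m-%d}"
    # copytree refuses to overwrite, so yesterday's export is never clobbered
    shutil.copytree(runbook_dir, target)
    return target
```

Running it once a week against the folder holding the server lists, network maps and contact sheets keeps the off-site copy reasonably fresh.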

The value of ImageRight in a disaster scenario expands beyond just our configuration documents. In an office where we deal with large amounts of paper, encouraging people to ensure those documents are added to ImageRight in a timely manner means faster access to work products after an event that prevents access to the office or destroys the paper originals.

Configuring Server 2008 AD – Traditional vs. Virtual Lab Exam

There has been some recent chatter on the web about the 83-640 exam, which is the “virtual lab” version of the 70-640 exam, TS: Windows Server 2008 Active Directory, Configuring. Both of these exams are available through Prometric (at least in CA), but 83-640 is not currently listed as an official exam option for the MCITP: Enterprise Administrator or for the MCITP: Server Administrator if you refer to the MCITP certification list. However, the exam details for 83-640 note that it DOES count toward them. It’s hard to say if or when this new version will officially replace the traditional exam.

I did have the chance to sit for the pilot of this test held in late 2008, when it was numbered as 70-113. While the test did have a multiple-choice section, the sections that were done in the virtual lab were actually fun. Yes, I thought the test was fun.

It really gives someone who works a lot with Windows a chance to showcase their skills without having to memorize the exact name of the tab or screen where a setting is located, as is often the case with the regular exam format. Instead, I worked on a fully functional server, making about 10 configuration changes in each test segment. I had access to everything I would have on a “real” server – I could click around to review all the tabs, settings and tools, and even had access to the help files. Once all the tasks were completed, I closed out that segment and moved on to the next.

The experience was as close to a true work environment as you could possibly get for a test. We all know that on any given day, we may not know exactly where to go for what needs to be done, but we certainly know it when we see it. And browsing a few tabs or pressing F1 is part of the process to jog our memories and get us back on track.

If I were given the choice to take 70-640 or 83-640 to meet my certification requirements, I’d take the “virtual lab” version, hands down. I hope Microsoft looks to this new format for future exams.

Catalog Error with Backup Exec

The disaster recovery project has been moving along in fits and starts. I was certainly expecting this to be a learning experience, and it hasn’t disappointed in that regard. Today, I kicked off a catalog of a new tape and promptly received this error:

The requested media is not listed in the media index and could not be mounted. To add the media’s catalog information to the disk-based catalogs, run an inventory operation on the media and resubmit the Catalog operation.

I ran another successful inventory of the tape for good measure, but the error remained. I rebooted the server and the tape drive. No love. Frustrating since I’ve been successfully cataloging tapes for the last few weeks.

Following the links from the error report, I turned off the option to “Use storage media-based catalogs.” By clearing the check box for this option, Backup Exec was forced to ignore any catalog information on the tape itself and build the catalog by reviewing each file on the tape individually. This process takes longer but, in my case, was successful.

This is the recommended change to make when normal catalog methods fail. It’s also something you’ll need to do if you must catalog the contents of a single tape from a backup job that spans multiple tapes, which can also fail if you don’t have all the tapes from the set in inventory. For more information about the differences between storage media-based catalogs and on-disk catalogs for Backup Exec, check out the additional explanation of the “storage media-based catalog” option at Symantec’s website.
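Conceptually, the two catalog methods are an index-versus-scan trade-off: the storage media-based catalog reads a pre-built index off the tape in one pass, while clearing that option forces Backup Exec to walk every file record on the tape to rebuild the catalog. A rough Python analogy (this is not Backup Exec’s actual on-tape format, just an illustration of why the fallback is slower but more robust):

```python
def catalog_tape(tape):
    """Return the list of files on a 'tape', preferring the media-based
    index and falling back to a full scan of individual file records."""
    index = tape.get("index")  # pre-built catalog stored on the media
    if index is not None:
        return index           # fast path: one read of the index
    # Slow path: enumerate every file record on the tape individually.
    return [record["name"] for record in tape["records"]]

# A tape whose on-media index is missing or unreadable still catalogs:
damaged = {"index": None,
           "records": [{"name": "payroll.mdb"}, {"name": "gl.mdb"}]}
print(catalog_tape(damaged))  # → ['payroll.mdb', 'gl.mdb']
```

The full scan succeeds precisely because it depends only on the file records themselves, not on a summary that may be absent or corrupt.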

Restoring IIS 6.0

I love the Internet. I use it every day. But when it comes to making websites work, it’s just not one of my strong areas. I’ve spent a good portion of the last decade working for smaller companies where being the “network administrator” meant being a bit of a jack-of-all-trades. While I don’t mind having to search for solutions to issues with software that I don’t use often, I’ve also learned which bits of the tech realm I’d rather leave to someone else. One of those is IIS.

However, this isn’t all about me hating on Internet Information Services. Last week, I actually had an experience restoring IIS 6.0 that was remarkably smooth and successful – restoring our company intranet to a different machine.

In order for this to be successful, I needed a portable backup of the metabase, my web folders and ASP.NET 2.0 (which we needed for some small web-based applications). ASP.NET 2.0 was missing from the base installation of IIS on the new server, but that was easy enough to correct. The web folders were being backed up nightly, but I was missing the metabase, which was key to making this all go well.

Microsoft TechNet had a rundown of how to back up and restore the metabase, and this post from IT Solutions KB even includes screenshots of the process. All in all, the whole process took fewer than 10 steps, including making the initial backup. I was pleasantly surprised, since I expected IIS to be far more complex. I understand that IIS 7.0 is even easier, but I doubt it’ll make me want to deal with IIS regularly!

Windows 7 Beta Exam for Pro: Enterprise Desktop Admin

Through the middle of the month, IT pros will be taking the beta version of exam 70-686, Pro: Windows 7, Enterprise Desktop Administrator. My exam slot was in the middle of last week, and as far as testing goes, this one hit on every possible area where you could run into Windows 7 in the enterprise.

Obviously, I can’t rattle off exam questions, and this test had more than the average share of them due to its beta nature. However, I can tell you that there was at least one question for EVERY bullet point in the skills list in the exam catalog.

Because this was geared to the enterprise, general experience with AD and group policy were important, as well as WAN/LAN networking concepts and security methods. And because this is a new OS with plenty of new features, don’t plan to empty your pockets at the testing center until you know the differences between the various options for application compatibility, the range of deployment methods (including image and licensing management) and how the newer features in IE8 and Windows Server 2008/2008 R2 can affect the desktop experience.

This exam, combined with the 70-680 exam, makes up the MCITP: Enterprise Desktop Administrator 7 certification. While this certification doesn’t require as many tests as the MCITP: Enterprise Administrator, it’s certainly gearing up to be challenging in its own right, as the desktop client is the portal through which the majority of workers experience your company network.

Microsoft Security Essentials

This evening I installed Microsoft Security Essentials on my Samsung NC10. I replaced the free Avast! scanner that I’ve been using since installing Windows 7. Avast! certainly appeared to be meeting my needs, however I was hoping to lighten the load on the basic hardware this netbook is sporting.

The MSE installation was quick and easy; the longest part was waiting for the initial full scan, which took about 8 minutes. The application seems very lightweight and has very few “moving parts” to configure. Outside of adjusting the schedule for the full scan and the desired actions for the various threat levels, it’s good to go. It’s advisable to check out what the “recommended levels” for the threat level responses are online (there’s a link) or in the help file, just so you have an understanding of how it’s going to react. Unless you have some deep desire to review everything before it’s removed, I think the default settings will meet the needs of most.

The last setting that probably warrants a little attention is the level of information you can opt to send to Microsoft SpyNet. Now, while the name might be a little suspect, SpyNet is the “community” all users running MSE must be part of to use the software. The basic setting will send information about detected malware, the action that was taken and the success or failure of that action. The advanced setting will also include the file names and location of the malware, how the malware operates and how it affected your computer. All this data is aggregated to improve signature updates and detection methods. It’s not possible to control which incidents you submit, so pick the level you are most comfortable with and accept that providing this data is part of what makes the product “free” and keeps it up-to-date and useful.

Finally, be sure to check out the Microsoft Security Essentials Community, part of the Microsoft Answers forums for Windows, Security Essentials and Windows Live. There are some lively threads about the feature set of MSE, as well as tips for troubleshooting and submitting possible bugs.

All in all, it seems like this product will fit right in with the other free scanners available and will be suitable for the average home user or very small business that doesn’t have a central way of managing virus and malware prevention.

24 Hours Offline – Connectivity is Addictive

I’m addicted to being connected. I admit it.

I went away with some friends for a couple days on a road trip to the Yosemite area this weekend. As soon as we left the major areas of civilization and began traveling through farmland, valleys and mountains my cellular signal became spotty and then abruptly failed.

My BlackBerry transformed from my link to friends, family and information into a pocket-sized camera, alarm clock and tip calculator. And while it was handy to have those things, I sorely missed my instant access to information about the sights we came across, sharing pictures and comments with friends near and far via Twitter and Facebook, and just “knowing” what was going on even though I wasn’t home making my way through my regular routine.

Instead, I enjoyed the informational displays provided by the park services about the places we visited. Shared my thoughts with those people directly around me. And much like the days before constant connectivity – I snapped photos of things to share with others later, though I wouldn’t have to wait a week to develop the film.

One of the friends joining us joked several times about my addiction to connectivity. Yet, he didn’t seem to mind when I found that two bars’ worth of the free wi-fi at our campsite trickled down to one of our cabins and I could schedule the DVR at home to record a football game he’d forgotten about.

I went through phases of being relaxed about being cut off from the world, and phases of being frustrated by the “X” in the spot where my signal should have been. I’m glad to have had the chance to get away for this adventure, but you can bet I was thrilled when we broke out of the dead zone and I was able to watch 24 hours of emails and SMS messages flood my phone like a dam had been opened.

I think it’s okay that the stream of electronic data and the flow of the babbling brook outside our cabin door both have a place in my life. Though I think a few well-placed signs warning that “cellular coverage will end in 5 miles” would help me with the transition. Addicts can’t always go cold turkey, you know.