I’m working with PowerShell, and digging it. I decided that I wanted to create a new script (blog post later, if I get it to work) and I wanted to try out different scripting tools to build it. I started with PrimalScript from Sapien. It’s slick. It’s powerful. It has very nice code completion, a great GUI, and out-of-the-box integration with source control.
I started scripting and everything was fine. Then I needed to run Invoke-Sqlcmd, so I decided to change the profile on the shell so it would load the SQL Server cmdlets automagically. I started getting this error:
Get-ItemProperty : Cannot find path ‘HKLM:\SOFTWARE\Microsoft\PowerShell\1\ShellIds\Microsoft.SqlServer.Management.PowerShell.sqlps’ because it does not exist
Hmmm… weird. I’m pretty sure I used the same command on my workstation as on my laptop. So I checked it out. Sure enough, same command. So I navigated through the path to ensure everything was in place. I opened up a separate shell on the laptop and on the workstation. They both showed the path just fine. I couldn’t figure it out. So I opened up the registry and browsed through directly. Yep, the path was there. Then I opened a shell directly from PrimalScript. I got the error again. Then I tried to navigate to my path. No love. Why?
Then I noticed, in the path in the window that opened from PrimalScript, two little numbers: 32. It was running the 32-bit shell. Funny enough, my install of SQL Server is 64-bit. It’s visible to the 64-bit shell, the one I ran, but it’s not there for the 32-bit shell. Now I get to track down how to configure the SQL Server cmdlets to be visible to both the 32-bit and the 64-bit shell. Fun.
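For anyone hitting the same wall, here’s a quick sketch of how to check which shell you’re in and whether the sqlps registration is visible from it. The snap-in names assume SQL Server 2008; adjust for your version:

```powershell
# How many bits is this shell? An IntPtr is 8 bytes in a 64-bit process,
# so this prints 64 in the 64-bit shell and 32 in the 32-bit shell.
[IntPtr]::Size * 8

# Can this process see the sqlps shell registration in the registry?
# A 32-bit shell on a 64-bit OS gets redirected and may not see it.
Test-Path 'HKLM:\SOFTWARE\Microsoft\PowerShell\1\ShellIds\Microsoft.SqlServer.Management.PowerShell.sqlps'

# Load the SQL Server cmdlets and provider (SQL Server 2008 snap-in names)
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
```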
Idera’s PowerShell Plus has a 64-bit version. I’m still trying to get Quest’s PowerGUI installed, but it looks like it’s 32-bit only. This is something to keep an eye out for when choosing and configuring your editor.
I’ve had the opportunity this week to take a class with Don Jones (blog | twitter), PowerShell guru and author of PowerShell 2.0 TFM (makes me laugh, each and every time). I’m not even going to attempt to teach you the amazing information Don has spent the last week pouring into my ears (since most of it spilled on the floor). But I do want to pass on some high points, gotchas, tips, and insights that Don communicated to us during the class, because they might help you pick up on the fundamentals of PowerShell.
That’s the most important lesson I got out of this week: you need the fundamentals to really start to put PowerShell to work. It’s very easy to make the mistake that I did. I went into PowerShell with a programmer’s background. I have built applications with Paradox Application Language (PAL & OPAL), Basic, Visual Basic, Java, and C#, and I’ve been working with VBScript for years, including work setting up custom rules in Operations Manager. So I attacked PowerShell like it was just another language. The problem is, PowerShell isn’t just another language. It really does have some fundamental differences that you need to be aware of, or you will get side-tracked, as I did. Getting side-tracked will not only keep you from properly learning the abilities of PowerShell; you’ll also miss out on the major benefits that proper construction of PowerShell scripts can convey.
Which fundamentals? The one key point I think I was missing was the fundamental importance of the pipe. It’s not simply a neat way to pass information from one function to the next; it’s how you should be doing a lot of the work. I wasn’t, but I will.
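To make that concrete, here’s a trivial sketch of the difference in thinking. The first version collects objects and loops over them like a traditional program; the second lets the pipeline do the filtering as the objects flow through:

```powershell
# Looping like a programmer: collect everything, then iterate
$services = Get-Service
foreach ($service in $services) {
    if ($service.Status -eq 'Stopped') {
        $service.DisplayName
    }
}

# Thinking in the pipeline: filter and shape as the objects flow through
Get-Service | Where-Object { $_.Status -eq 'Stopped' } | Select-Object DisplayName
```

Both produce the stopped services, but the pipeline version composes with whatever comes next (Sort-Object, Export-Csv, and so on) without any plumbing code.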
Here, largely taken from my notes (so these are not always Don’s precise words), are some things you might want to know. Any and all errors are mine and mine alone:
- No one is going to be bleeding from the eyes at the end of the course. Great news since I usually bleed from the ears after drinking from the Itzik Ben Gan firehose of information during his classes.
- Don is convinced that it is Microsoft’s intention to move to a two-tier structure of haves and have-nots, consisting of the first-class citizens who use PowerShell and the second-tier citizens who use only GUIs. I commented on that at length here.
- Law #1: Thou shalt ask for help. By this he didn’t just mean asking him for help during the class, but learning how to use PowerShell help. He’s right. The more you know how to access the help system, the better.
- Use Consolas instead of the raster font. It really did help.
- If you find yourself messing with text, you’re doing it wrong. Which pretty much invalidates half the scripts I’ve written so far in PowerShell.
- F7 to get a quick list of commands. Works wonderfully.
- Set up a cmdlet-of-the-day calendar. He suggested tossing the Dilbert daily cartoon calendar and setting up a cmdlet calendar so that you look at a cmdlet in detail each day as a learning tool. I’ll be working on mine.
- If an object binds by PSObject (or PSValue, I can’t quite read my notes) it will always and only bind by PSObject, despite what the documentation says. OK.
- Invoke-Command and remoting are too flipping cool. Parallel execution of the scripts.
- Security is built into the product:
- script execution is prohibited by default
- .ps1 files are not associated with the shell
- you must provide a path to run a script
- the underlying security is not bypassed
- Get a certificate, then generate and use nothing but signed scripts. This makes sense and I’ll get to work on it. It’ll change how I post scripts here, but I’ll cope.
- Don’t modify things out of scope. This is actually a fundamental bad behavior. You can modify stuff out of scope, but that doesn’t mean you should.
- PowerShell GUIs are possible, but if you need a multi-form app, there is a wonderful product called Visual Studio.
- He had a function he called a “ghetto breakpoint” that he used in v1. I just liked the term.
- If you’re using FOREACH all over the place, you’re doing it wrong. And there go the other half of my scripts….
- If all you’re doing is writing C# code in PowerShell, then use C#.
- While we can do a ton of cool stuff with SQL Server through PowerShell, SQL Server actually exposes a very stunted set of methods and processes to PowerShell. As a matter of fact, running a query is about the only thing it really exposes, which is why I’ve been using SMO so frequently.
- My words, not Don’s: the ISE stinks, get an editor.
- I’m trying out Sapien’s PrimalScript because of the class… it might replace Idera’s PowerShellPlus in my toolbox. We’ll see.
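On the execution policy and script signing points above, here’s a rough sketch of what I’ll be setting up. It assumes a code-signing certificate is already installed in the current user’s personal store, and the script name is just an example:

```powershell
# See what the current policy is; out of the box it is Restricted,
# which is why script execution is prohibited by default
Get-ExecutionPolicy

# Require that every script be signed before it will run
Set-ExecutionPolicy AllSigned

# Grab a code-signing certificate from the personal store and sign a script
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath .\MyScript.ps1 -Certificate $cert
```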
I’m sure there were better & more important things that I should have picked up, but this is the list of items that stood out. It was a great opportunity to get to learn this from Don and I heartily recommend you track down his classes and attend them. You won’t be sorry.
If you haven’t heard, please allow me to let you in on a little secret. The Professional Association for SQL Server (PASS) is holding open nominations for the board. If you are interested in running for the board, or know someone who should run, get to this location as soon as possible and get their name in.
I feel that PASS does a lot for individual database users all over the world. I know people who take the contrary position and say that PASS doesn’t do anything for them. If you’re of that opinion, that’s great. As a matter of fact, if you’re of that opinion, I’d suggest you should run for the board. Now is your chance to make a difference. Now is the time for you to provide guidance and leadership to start to turn the aircraft carrier that is the PASS organization. Now is your opportunity to help your fellow database professionals with whatever it is that you think the PASS organization should be providing, but isn’t. Now, right now, is when you need to come forward, toss your hat in the ring, put your name in, get involved, make a difference.
Please consider this a personal invitation to help chart a new course for the SQL Server community. You’ve been asked, please take action.
I’ve been attending a PowerShell fundamentals class with Don Jones (blog|twitter). If you read my blog, you might be aware that I’ve posted a few PowerShell scripts in the past. So why was I attending a fundamentals class? Because I didn’t know what I was doing. I knew going into the class that I needed a better grounding in the fundamentals of Posh, but after the first day of Don’s excellent class, I realized that I had been working with PowerShell without a clue as to how it really worked.
Don’s class is excellent and I could spend a lot of time talking about just that (which I’m sure would make Don happy). However I want to concentrate on something that he said during class that really resonated because I think it’s true. You don’t hear “true” things all the time, so when one jumps up and bites you, it’s worth paying attention. I don’t have his exact quote written down, so this will be more than a bit of a paraphrase. Don laid down the argument that Microsoft is creating a two tier structure where low level admins will have a GUI and the real experts will be using PowerShell. He showed how some of the functionality available in AD is already split and he said that more and more products coming out of Microsoft are going to be showing this same split.
So why do I see this as important? Well, it’s simple really. I don’t want to be in the low-level, second-tier, LOWER PAID, MORE EASILY REPLACED set of knowledge workers. I want to be in the group that really can make things go. I want to ensure my employability into the future (at least until I win the lottery… which will happen right after I start playing it). You may not like PowerShell. You may not like Microsoft, SQL Server, Windows or any of that stuff. You may be a devoted Mac-head (like Don, Brent, Aaron…). But if your job is related to managing Windows servers, SQL Server, Operations Manager, SharePoint, Exchange… you get the idea: regardless of how you feel about PowerShell and the rest, if you want to be better, more powerful, and more employable, you need to learn PowerShell.
The good news is, it’s not that hard. The better news is, you can contact Don and get into one of his classes. The even better news is, if you’re a lazy, good-for-nothing layabout like me, learning PowerShell will make you more efficient, which means more time to lie around doing nothing while still looking good for the boss. So for those of you who believed you just didn’t need to learn PowerShell… well, you’re wrong. Go get started. I’d suggest learning the fundamentals first, not just hacking away at it like I did. Don’s book might be a good place to start (TFM…. [snicker]).
And in case you can’t make that, Thursday, June 17th, I’ll be on Quest’s Pain of the Week. It should be fun too.
The company where I work has been using virtualization in development, QA, testing, etc., for many years now. We have not even attempted virtualization in production. Things change. But before we did it, I thought I’d bounce the question out to the people who know: what are some things to watch for when using virtual servers in a production environment?
Flat out, I asked people over Twitter. The overall message was extremely positive. No one suggested it was a bad idea. A few thought that some high-volume servers are better served by physical boxes; volume in this case could mean high amounts of reads or high amounts of writes, and both were suggested as reasons to avoid virtualization. The cause for this was clear and consistent: from each individual, the number one problem was around I/O.
I put out a general call for information and received a pretty good set of responses. I’m not expert enough in all this technology to put together a single summary, so I’ll more or less present the information as it was presented to me. When all the information was bouncing past, a few people asked me to aggregate it, so here it is.
What to watch for
According to Allen Kinsel (blog|twitter), a DBA for a multi-line insurance company who manages Oracle, Sybase and SQL Server and is very experienced with virtualization in support of all his products, I/O can more than double from a virtual machine. With this in mind, Mr. Kinsel and Brent Ozar (blog|twitter), the second non-Microsoft employee in North America to be certified as a SQL Server Microsoft Certified Master (MCM) (I think that’s the claim), both recommend avoiding the use of raw LUNs unless you experience severe I/O binding issues. Brent and Allen both recommend getting limited vSphere access in order to closely monitor I/O through the VMware interface, because the normal I/O measurements will not be accurate (Allen suggests buying the VM admin some beer, probably excellent advice). Brent goes on to outline a few other specific points. Set a minimum memory reservation on the VM to avoid the balloon driver telling SQL Server it’s out of memory. Use as few virtual CPUs as possible; if the physical box averages 25% CPU load across four physical cores, then only add two virtual cores. Use IOMeter or SQLIO to test the configuration; if you’re maxing out at 100-125 MB/sec, then you’re looking at an I/O bottleneck and will need to make adjustments to the system. Brent says that with a Fibre Channel SAN, such as we use, we should be fine on I/O as long as we’re set up for active/active multipathing and have more than two HBAs per physical box.
Several people listed specific items to keep an eye on:
- Disk I/O
- CPU queuing on the host
- Memory overcommit
- Disk I/O overcommit
- CPU overcommit
Several people, rather than sending me stories, or in addition to the stories, simply pointed out documentation you should take advantage of. First, and most important from the standpoint of SQL Server and virtualization: the Professional Association for SQL Server (PASS) has a special interest group dedicated to virtualization: http://virtualization.sqlpass.org
Disk Performance Information:
SAN Multipathing (introductory): http://www.brentozar.com/archive/2009/05/san-multipathing-part-1-waht-are-paths/
SAN Performance Tuning with SQLIO
Virtualization and SAN Basics
How Microsoft IT Reduces Operating Expenses using Virtualization http://blogs.msdn.com/b/nickmac/archive/2010/05/28/how-microsoft-it-reduces-operating-expenses-using-virtualization/
SQL Server Performance on Vmware vSphere: http://blogs.vmware.com/performance/2009/06/sql-server-performance-on-vspehere.html
That’s it. I won’t pretend that I understood everything. I’m passing it along in a semi-structured format as a public service. I also received some information from the MVP back channels, but I can’t publish that without permission. Suffice to say, while extremely useful, none of it changed the overall picture as represented by the information listed above.
I’ve been playing with SQL Server 2008 R2 for quite a while in the CTPs and whatnot. But I hadn’t made a concerted effort to look at the new version of Reporting Services… HUGE mistake. There are a number of not-insubstantial updates to Reporting Services that are probably the primary selling points of 2008 R2. The big one, for me, was the incorporation of mapping directly into reports. Did I say big? I’m sorry, I meant enormous, gigantic, galactic, really, really important… Why, you ask? I work for an insurance company. We insure factories, warehouses, that sort of thing. Funny bit of information about a factory: when floods, high winds or earthquakes come calling, it just doesn’t seem to be able to get out of the way adequately. Weird, huh? With that in mind, it’s really useful to know not simply where a building is located, but whether that location is in a 50-year or a 500-year flood zone, whether it’s subject to high winds, whether the New Madrid fault line runs straight across the factory floor… Because that factory isn’t going anywhere, you want to know how it should be built to withstand that flood, wind, earth movement, etc. In other words, spatial data is a big deal where I work.
Using maps in reports is incredibly simple. First you need a query. Here’s one that runs against AdventureWorks2008R2:
SELECT a.City,
       a.PostalCode,
       a.AddressLine1,
       sp.Name,
       a.SpatialLocation
FROM Person.Address AS a
JOIN Person.StateProvince AS sp
    ON a.StateProvinceID = sp.StateProvinceID
WHERE sp.Name = 'Arizona';
With that query, creating a report is simple. Open up BIDS, add a data source, create a report, and plug that query into it. Click Next, Next, Next however many times you need to in order to arrive at a report. You can blank it out, removing all the columns, or leave it alone, any way you want. Then click on the Tools menu and select Map. Click on the report to place a map and you’ll get this dialogue box. You can then drill down and select a number of different maps, built right into Reporting Services, just as I did for the map of Arizona:
Once you have the map you want, click Next again and you’ll see a layout of what the map piece of the report is going to look like. You can work with a number of options, up to and including adding information from Bing Maps, for free. This is slick stuff:
You can finish from there and you get a map. But there’s no data on it. Now you need to click on the map and add a layer. Again, you can use a wizard, the Layer Wizard, to select a spatial query, maybe the one we created at the top of this experiment. This is then placed on the map as points or shapes, depending on the data. Below is the finished map:
No big deal right? Except, this was free, I didn’t type a bit of code, I didn’t set any properties or muck about with all sorts of menu choices or do much of anything at all, but I ended up with what could be a useful report. It took longer for me to type this sentence than it did to create the map. Imagine what you can do if you actually tried to put some work into it. This is a fantastic new resource and one that I’m going to be spending a lot of time exploring. I think many of you will find it useful as well.
Or maybe I’m misreading that I’m supposed to be helping to alleviate pain. Either way, next Thursday, June 17, I’ll be taking part in the Quest Pain of the Week webcast: Easy Ways to Fix Hard SQL Server Problems – Without a Guru. It should be a lot of fun. Please stop by and ask hard questions.
Since all the cool kids seem to be posting the sessions that they submitted to the PASS Summit, nerd that I am, I’ll follow along and do the same. I submitted four sessions, two by invitation for a Spotlight session and two for regular sessions. I tried to branch out a bit from where I’ve been in the past to see if I can start talking about different topics. With that in mind, the first session was: Spatial Data: The Business Case
We’ve all seen the cool presentations showing all the pizza joints near the conference center or all the bicycle shops on a biking route, but what’s the case for spatial data and business? This session sets out to show how spatial data can be of interest to almost any business that has more than one location to worry about. Are weather events or natural disasters affecting your locations? Without the ability to precisely map them, how can you know? Is your new building on a flood plain? If you’re planning a visit to a customer, are there other customers nearby that you could also visit? The session focuses on a business case for using spatial data, but it also shows some of the technical means of solving those cases using the spatial data type, Microsoft Bing Maps and Reporting Services.
The next “different” session was: Database Deployment in the Real World
Deploying databases can be a difficult challenge. This session will provide a general approach to database development and database deployment that seeks to alleviate the issues around getting databases deployed. There will be general methods on display, such as the use of source control as a part of database development, and scripting methods for production deployments. There will also be specific methods using a variety of tools to meet this deployment methodology. Tools from Microsoft, such as Visual Studio and the DTA, and various 3rd party vendors will be demonstrated. The goal of the session will be to provide mechanisms for attendees to apply to their own databases in order to arrive at a safer and more reliable deployment process.
After that I went back to my usual topics: Identifying and Fixing Performance Problems using Execution Plans
This session will demonstrate how SQL Server execution plans can be used to identify problems with the database design or the T-SQL code, and to address those problems. The session takes the user through various common issues, such as poor or missing indexes, badly written code and generally bad query performance, demonstrating how to identify the issues involved using execution plans. The session will then demonstrate different methods for addressing the issues and show how the fixed query’s execution plans differ. Multiple methods for accessing execution plans, including graphical plans, DMVs, and trace events, are demonstrated. All this is meant to lay a foundation for a general troubleshooting approach to empower the attendees to make their own queries run faster and more consistently.
And even recycled a session from last year, again, as an experiment: DMV’s as a Shortcut to Procedure Tuning
Dynamic Management Views (DMVs) expose a wealth of information to the database administrator. However, they also expose information that is vital to the database developer. More often than not, people gather performance metrics through server-side traces. This session will show how to gather information from the DMVs for currently executing and recently executed queries. The session will demonstrate combining this information with other DMVs to get more interesting information, such as the query plan and query text. I’ll show where you can get aggregate information for the queries in cache to determine which queries are being frequently accessed or using the most resources. I’ll show how to determine which indexes are being used in your system and which are not. All of this will be focused not on the DBA, but on the query writer, the developer or database developer who needs information to tune and troubleshoot data access.
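As a small taste of the kind of thing the session covers, here’s a sketch that pulls the most expensive cached queries by combining sys.dm_exec_query_stats with sys.dm_exec_sql_text, run from PowerShell via Invoke-Sqlcmd. The server name and column choices are just illustrative:

```powershell
# Assumes the SQL Server snap-ins are loaded so Invoke-Sqlcmd is available
$query = @"
SELECT TOP 10
       qs.execution_count,
       qs.total_worker_time,
       qs.total_logical_reads,
       st.text
FROM   sys.dm_exec_query_stats AS qs
       CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
"@

# '(local)' is a placeholder; point it at your own instance
Invoke-Sqlcmd -ServerInstance '(local)' -Query $query
```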
There are absolutely glorious sessions that have been submitted this year (and if you haven’t looked them over, you should). The competition is going to be fierce, which will make for an excellent Summit.
Oh, and don’t forget that Wednesday is Kilt Day at the PASS Summit.