DaaS for Azure Web Sites is a great tool when “bad things happen to a good site”! It allows you to collect a plethora of different logs and parses them into an easily digestible format. The idea is to enable you to get to the root cause without turning to the forums or Microsoft support. I’m running a WordPress site that, knock on digital wood, is pretty rock solid. Even so, I’m going to use this site to play around.
Until the recent announcement, the collection and parsing of PHP logs wasn’t an option, but now DaaS will collect PHP-specific files along with Event Logs, Memory Dumps, and HTTP Logs. First, you will need to enable Web Server Logging to the File System. With that done, just browse over to https://<YourSiteName>.scm.azurewebsites.net/DaaS and you’ll get a page like this…
All you need to do is click Diagnose Now and you’re off to the races. Oh, and you can also schedule an analysis if you’d like…
Once that is complete you can download your logs at: https://<YourSiteName>.scm.azurewebsites.net/Zip/data/DaaS.
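If you’d rather script the download, something like this should work. This is a sketch, not the official method: the site name is a placeholder, and I’m assuming the SCM (Kudu) endpoint accepts your deployment credentials for basic authentication.

```powershell
# Placeholder site name -- replace with your own
$siteName = "YourSiteName"

# Your deployment (Kudu) credentials, not your Microsoft account
$cred = Get-Credential

# Pull the DaaS log package down as a zip via the SCM endpoint
Invoke-WebRequest -Uri "https://$siteName.scm.azurewebsites.net/Zip/data/DaaS" `
                  -Credential $cred `
                  -OutFile "$env:TEMP\DaaSLogs.zip"
```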
We’ve all seen compute power sitting idle, costing us money… Well, no more! Automate the creation, deployment, monitoring, and maintenance of resources in your Microsoft Azure environment using a highly scalable and reliable workflow execution engine.
Demo files at: http://aka.ms/githubautomation
Up until recently, we’ve had to use the REST API to work with Management objects in Windows Azure. This is great as all the overhead is stripped away, and anything can access it. However, a .NET developer has a little extra work generating the URLs, handling authentication, and making requests. The new Management API included in the .NET SDK is a welcome enhancement. It’s still in preview and sometimes doesn’t work as expected… Here’s a little code to get you started.
Install-Package Microsoft.WindowsAzure.Management.Libraries -IncludePrerelease
Ref: Windows Azure Management Libraries
This is a very simple demonstration for calling the Management API.
Maybe I’m just late to the party, but this tool is just awesome! Install it, run it, and say yes to let it create the default script. AutoHotKey has its own “language,” and I use that term loosely. Here are a couple of helpful things to note that should get you started. The semicolon is a comment:
; Basic modifier symbols:
; # Windows key
; ! Alt key
; ^ Control key
; + Shift key
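Putting those modifiers together, a hotkey can be a one-liner. Here’s a minimal sketch of what that looks like (the Bing URL is my assumption):

```autohotkey
; Ctrl + Alt + Shift + B opens the default browser at Bing
^!+b::Run, https://www.bing.com
```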
The above command will launch the default browser and navigate to Bing when you press Ctrl + Alt + Shift + B. Frankly, some of these long hotkeys don’t seem that useful, and I won’t remember them. There are times, however, when I’m having to modify text by pressing down, over, over, over, backspace, backspace, rinse and repeat. You know what I’m talking about… This is where AutoHotkey comes to the rescue. To do just that you would:
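A sketch of that kind of keystroke automation (the exact key sequence and the Ctrl + Alt + D binding are just illustrative):

```autohotkey
; Ctrl + Alt + D: move down a line, right three characters, then delete two
^!d::Send, {Down}{Right 3}{BackSpace 2}
```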
You can even use loops and other constructs… just awesome! Go forth and do.
In April we, Microsoft, announced we’re teaming up with NBC Sports to deliver digital media using Windows Azure Media Services. I’ve had the pleasure of working on the team at Microsoft that’s making this happen. As you can imagine, this has been a very demanding but exciting project to be a part of. I’ve had many opportunities to learn about Windows Azure and media in general. I hope over the coming weeks to share some of my experiences…
Wednesday I’ll be flying out to Las Vegas to work with the team to ensure our users have a great experience…
Recently the folks over on the Azure team released a bunch of improvements to Azure. One of them caught my eye: they’ve provided us a way to enable PowerShell remoting when provisioning a machine. As someone who tries to automate all things, I was extremely happy to see this. Gone are the days of creating images with startup scripts set, but there is a problem…
After you provision a machine with that check box checked, you’ll probably try something like this…
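The attempt looks roughly like this (the cloud service name and port match the error output below; -UseSSL is needed because the endpoint is configured for SSL):

```powershell
# Credentials for the VM's admin account
$cred = Get-Credential

# Connect to the WinRM endpoint exposed by the cloud service
Enter-PSSession -ComputerName psremoter.cloudapp.net -Port 5074 -Credential $cred -UseSSL
```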
And then you’ll see the following exception:
Enter-PSSession : Connecting to remote server psremoter.cloudapp.net failed with the following error message : The server certificate on the destination computer
(psremoter.cloudapp.net:5074) has the following errors:
The SSL certificate is signed by an unknown certificate authority. For more information, see the about_Remote_Troubleshooting Help topic.
At line:11 char:1
+ Enter-PSSession -ComputerName psremoter.cloudapp.net -Port 5074 -Credential $cre …
+ CategoryInfo : InvalidArgument: (psremoter.cloudapp.net:String) [Enter-PSSession], PSRemotingTransportException
+ FullyQualifiedErrorId : CreateRemoteRunspaceFailed
Let’s take a step back and look at what I’m talking about. When you enable this feature, PSRemoting is configured for SSL and uses a self-signed certificate. This certificate needs to be added to your root certificate store so that you can access the machine. Let’s do this…
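Here’s a sketch of that script, assuming the same host name and port as above. It temporarily disables certificate validation so the SSL handshake completes, grabs the certificate off the ServicePoint, and adds it to the Trusted Root store (run this elevated):

```powershell
$hostName = "psremoter.cloudapp.net"
$port = 5074
$uri = "https://${hostName}:${port}"

# Temporarily skip certificate validation so the handshake succeeds
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }

# The request populates the ServicePoint's Certificate property
$sp = [System.Net.ServicePointManager]::FindServicePoint([Uri]$uri)
$request = [System.Net.WebRequest]::Create($uri)
try { $request.GetResponse().Close() } catch { }

# Restore normal validation and capture the certificate
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = $null
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $sp.Certificate

# Add it to the local machine's Trusted Root store
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store "Root", "LocalMachine"
$store.Open("ReadWrite")
$store.Add($cert)
$store.Close()
```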
This script creates a ServicePoint to get the certificate, then adds it to your trusted store. Now you’re free to “Enter-PSSession” or “Invoke-Command” all you’d like!
A lot of the demos, lab environments, and tests I do are in Azure. I try to script everything so that it’s repeatable and I don’t turn into a monkey clicking Next. What fun is that? With that said, I’ve put together how I set up my environment for communicating with Azure via PowerShell.
First things first, you’ll need the Microsoft Web Platform Installer. Once it’s installed, go ahead and open it. You should see a screen that looks like this:
Click Add next to Windows Azure PowerShell and click Install.
You should see the Prerequisites screen, click I Accept.
This should start the install. Once it’s complete, the module for Azure PowerShell will be installed, along with some of the Azure SDK libraries. But that’s only part of the story. Now that we have the Azure module installed, we need to configure it to talk with our Azure subscription (by the way, you’ll need one for this to work!). Here again, this is something I do a little too often, so I keep a script around that handles everything.
Let me explain what’s going on here. First, we import the module you just installed. This needs to be done each time you open PowerShell unless you use the Azure PowerShell shell. [*Hint: put it in your profile.]

Next we have a couple of options. If you don’t have a certificate, you’ll need to create one; this is where the “makecert” command comes in. Once you’ve created it, you’ll need to add it to your subscription by uploading the CER file under “Settings” in the management portal. If you already have a cert, you can just import it into your certificate store using the “Import-PfxCertificate” cmdlet. Once the certificate is in your store, save it to a variable for use later.

Go to the Azure portal and get your Subscription ID, and set “MySubId” to that value. The subscription name can be anything, really. Your storage account name is also in the Azure portal: go to Storage and you’ll see one or more accounts, depending on how you’re set up. Grab the name of the one you want and use it as the value of the “myStorageSubscription” variable.

The last two commands are where the magic happens… “Set-AzureSubscription” creates the subscription on your machine, and “Select-AzureSubscription” makes it active. You can have multiple subscriptions!
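The steps above can be sketched as a script like this. Treat it as a sketch: the thumbprint, subscription ID, and account names are placeholders, and makecert requires the Windows SDK on your path.

```powershell
Import-Module Azure

# Option 1: create a new management certificate
# (then upload the .cer file under "Settings" in the management portal)
makecert -sky exchange -r -n "CN=AzureMgmt" -pe -a sha1 -len 2048 -ss My "AzureMgmt.cer"

# Option 2: import an existing certificate instead
# Import-PfxCertificate -FilePath .\AzureMgmt.pfx -CertStoreLocation Cert:\CurrentUser\My

# Grab the certificate from your store (placeholder thumbprint)
$myCert = Get-Item Cert:\CurrentUser\My\<YourCertThumbprint>

$mySubId = "<YourSubscriptionId>"                     # from the Azure portal
$subscriptionName = "MySubscription"                  # any name you like
$myStorageSubscription = "<YourStorageAccountName>"   # from the Storage section of the portal

# Create the subscription on this machine, then make it active
Set-AzureSubscription -SubscriptionName $subscriptionName `
                      -SubscriptionId $mySubId `
                      -Certificate $myCert `
                      -CurrentStorageAccount $myStorageSubscription
Select-AzureSubscription -SubscriptionName $subscriptionName
```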
Now just run “Get-AzureVM” and you should see a list of the VMs you have running.
Recently a customer reached out to me with a challenge. They’d just purchased a number of Surface Pro devices. Being good stewards, they wanted to ensure these devices had BitLocker enabled. We solved this with a simple PowerShell script that gathered the information and then emailed a log.
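A sketch of what such a script might look like (the SMTP server, addresses, and log path are hypothetical; Get-BitLockerVolume requires Windows 8 / Server 2012 or later and an elevated session):

```powershell
# Gather BitLocker status for every volume into a log file
$log = "$env:TEMP\BitLockerStatus.txt"
Get-BitLockerVolume |
    Select-Object MountPoint, VolumeStatus, ProtectionStatus, EncryptionPercentage |
    Format-Table -AutoSize | Out-String | Set-Content $log

# Email the log (placeholder addresses and server)
Send-MailMessage -From "reports@contoso.com" -To "admin@contoso.com" `
                 -Subject "BitLocker status for $env:COMPUTERNAME" `
                 -Body (Get-Content $log -Raw) `
                 -SmtpServer "smtp.contoso.com"
```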