External Access to Azure Stack

Here is a little community gift for the new year (2017). Azure Stack expert Ruud Borst (@Ruud_Borst) recently published a blog post titled “Expose the Azure Stack Portal through NAT”. Ruud included a PowerShell script in this blog post that simplifies extending external access to Azure Stack.

The PowerShell script runs on your Azure Stack host and creates the NAT IP mappings on MAS-BGPNAT01 to expose your Azure Stack instance to your external network.

We no longer have to work through a bunch of tedious steps to give external access to Azure Stack. Thanks Ruud! Great example of community power. With Ruud’s script it can be done even if you already have Azure Stack deployed. The link to his blog post and script is here:

https://azurestack.eu/2016/12/expose-portal-azurestack-through-nat

Running the script is as easy as running something like this:

.\Expose-AzureStackPortal.ps1 -PortalExternalIP YOURFIRSTIPHERE -ACSExternalIP YOURSECONDIPHERE

Add -AppServiceAPIExternalIP if you are using the App Service RP; you will need to specify a third IP for it. SQL and MySQL both use the -PortalExternalIP, so there is no need for an extra IP for those.
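For example, with the App Service RP deployed a run would look something like this (the IPs here are just placeholders from my lab network; use free IPs on your own network):

.\Expose-AzureStackPortal.ps1 -PortalExternalIP 192.168.1.40 -ACSExternalIP 192.168.1.45 -AppServiceAPIExternalIP 192.168.1.50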

A successful run of the script should look like this:

VERBOSE: Created NAT external addresses 192.168.1.40 and 192.168.1.45 for Portal and ACS.

VERBOSE: Created Static NAT port mappings on 192.168.1.40 to 192.168.102.7 for Portal
VERBOSE: Created Static NAT port mappings on 192.168.1.40 to 192.168.102.12 for XRP
VERBOSE: Created Static NAT port mappings on 192.168.1.45 to 192.168.102.3 for ACS
VERBOSE: Created Static NAT port mappings on 192.168.1.40 to 192.168.102.14 for SQLrp
VERBOSE: Created Static NAT port mappings on 192.168.1.40 to 192.168.102.1 for MySQLrp

The last step in this process is to add the DNS records on your external network, or to the hosts file on external servers or clients. Ruud explains this in his blog. I extended Azure Stack to my Buchatech lab environment, so I went the DNS route.

For DNS entries I used a CSV file and PowerShell to import all of the DNS records I needed for Azure Stack. I used a PowerShell script from a fellow MVP. The blog post with that script can be found here:

http://www.lazywinadmin.com/2012/10/create-dns-entries-using-powershell-and.html
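If you just want the gist of the import, here is a minimal sketch of my own (not the script from that link) that assumes the CSV columns shown below, treats every row as an A record, and assumes the DnsServer PowerShell module (RSAT DNS tools) is available where you run it:

# Minimal sketch: import A records from a CSV with name,ip,type,zone,dnsserver columns.
# The CSV path is an example; wildcard names such as *.blob should work as record names, but verify in your zone.
Import-Csv -Path .\AzureStackDnsEntries.csv | ForEach-Object {
    Add-DnsServerResourceRecordA -Name $_.name -ZoneName $_.zone -IPv4Address $_.ip -ComputerName $_.dnsserver
}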

Here is what the CSV file should look like:

name,ip,type,zone,dnsserver
portal,192.168.1.40,A,azurestack.local,dc.buchatech.com
api,192.168.1.40,A,azurestack.local,dc.buchatech.com
xrp.tenantextensions,192.168.1.40,A,azurestack.local,dc.buchatech.com
keyvault.tenantextensions,192.168.1.40,A,azurestack.local,dc.buchatech.com
health.adminextensions,192.168.1.40,A,azurestack.local,dc.buchatech.com
compute.adminextensions,192.168.1.40,A,azurestack.local,dc.buchatech.com
network.adminextensions,192.168.1.40,A,azurestack.local,dc.buchatech.com
storage.adminextensions,192.168.1.40,A,azurestack.local,dc.buchatech.com
*.blob,192.168.1.45,A,azurestack.local,dc.buchatech.com
*.queue,192.168.1.45,A,azurestack.local,dc.buchatech.com
*.table,192.168.1.45,A,azurestack.local,dc.buchatech.com
sqlrp,192.168.1.40,A,azurestack.local,dc.buchatech.com
mysqlrp,192.168.1.40,A,azurestack.local,dc.buchatech.com

Here is the CSV file I used so you don’t have to create it.

Azure Stack DNS Entries

Notice something different I did with my DNS: I did not add a *.azurestack.local wildcard record. I left that out because it caused the storage DNS entries to respond with the PortalExternalIP instead of the ACSExternalIP. Here is a screenshot of my Azure Stack DNS zone in my Buchatech domain:

After adding the DNS records and installing the Azure Stack certificate in the Trusted Root Certification Authorities store, I was able to access the Azure Stack portal and connect via PowerShell or Visual Studio without a VPN. 🙂

Here is a screenshot of me connecting to Azure Stack’s portal from my Buchatech.com domain on one of my utility servers.

A huge thanks to Ruud for building that PowerShell script. I am excited about having access to Azure Stack from my other lab network because this opens up all sorts of possibilities and will net some cool blog posts very soon!

Happy Stacking!

Read More

Azure or Azure Stack “Write Once, Deploy Anywhere” Update

A while back I wrote a blog post about being able to take one IaaS VM Azure Resource Manager (ARM) template and deploy it to either Azure or Azure Stack. That blog post included a JSON file and the PowerShell to do this. The idea came from needing to set up a cool, working demo for MMS 2016 and the need to showcase the power of Microsoft’s hybrid cloud. Here is a link to the original blog post:

Write once, deploy anywhere (Azure or Azure Stack)

Today I finished updating the PowerShell and ARM template/JSON file to be more streamlined and to work with TP2. Here is the link to download them:

https://gallery.technet.microsoft.com/Create-VM-on-Azure-or-3c6d0420

Here are the updates:

  • The JSON and PowerShell script have been modified to work with Azure Stack TP2.
  • This script now utilizes the connection PowerShell module AzureStack.Connect.psm1 from Azure Stack tools.
  • The module is included with the download of this script and JSON file on the TechNet Gallery.
  • The script is hard coded to look locally to import the AzureStack.Connect.psm1 module.
  • Streamlined the JSON file and PowerShell script.
  • The script no longer prompts for the publicDNS name. It is now automatically set to the same as the vmname.
  • The script no longer prompts for the storage account name. It is automatically set to vmnamestorage.
  • The script no longer prompts for the resourcegroup name. This is now automatically set to vmname-RG.
  • By default this script now uses a JSON file hosted on GitHub. This is set in the $templateFilePath variable as shown below this list.
  • To keep it local, just set $templateFilePath to the JSON file name.

GITHUB: $templateFilePath = "https://raw.githubusercontent.com/Buchatech/Azure-AzureStackVM/master/AzureandAzureStack.json"
LOCAL: $templateFilePath = "AzureandAzureStack.json"
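To illustrate how the streamlined naming works, here is a rough sketch of the flow. This is not the actual Gallery script; the VM name, location, and deployment call are illustrative, and the template's own parameters are omitted:

# Rough sketch only - the real script is in the TechNet Gallery download above.
$vmName         = 'demovm01'                      # example VM name
$location       = 'local'                         # Azure Stack TP2 region, or an Azure region like 'eastus'
$resourceGroup  = "$vmName-RG"                    # resource group derived from the VM name
$storageAccount = ($vmName + 'storage').ToLower() # storage account derived from the VM name
$publicDnsName  = $vmName.ToLower()               # public DNS name set to the VM name

New-AzureRmResourceGroup -Name $resourceGroup -Location $location

if ($templateFilePath -like 'http*') {
    # JSON hosted on GitHub
    New-AzureRmResourceGroupDeployment -Name "$vmName-deploy" -ResourceGroupName $resourceGroup -TemplateUri $templateFilePath -Verbose
}
else {
    # local copy of AzureandAzureStack.json
    New-AzureRmResourceGroupDeployment -Name "$vmName-deploy" -ResourceGroupName $resourceGroup -TemplateFile $templateFilePath -Verbose
}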

This will be my last blog post of 2016. See you next year folks…..

Happy Stacking!

Read More

Detailed SQL RP Azure Stack TP2 Deploy & Config

Microsoft has made a new version of the SQL resource provider (RP) for Azure Stack TP2 available. It can be found here in the documentation: https://docs.microsoft.com/en-us/azure/azure-stack/azure-stack-sql-resource-provider-deploy. This RP is an add-on for Azure Stack and allows you to offer SQL as PaaS.

This is a great SQL self-service scenario for Hybrid Cloud. The aforementioned link contains documentation on how to deploy the SQL RP. There are some “Gotchas” with the RP and some other information that is important when deploying and configuring this RP.

I am going to walk through my deployment and configuration experience in this blog post, covering the “Gotchas” and other important information. The post is broken out into the following sections:

  • Deployment
  • RP Configuration
  • Offer/Plan Setup
  • Tenant provisioning of SQL PaaS

Ok. Now let’s dive into it.

– DEPLOYMENT –

Before you begin, go to the documentation link above and review the RP documentation. You can download the RP from that page via the Download the SQL Server RP installer executable file link.

Once the RP is downloaded extract the files and scripts by running AzureStack.Sql.5.11.251.0.exe. You should have the following:

mastp2-sql-rp-1

Now from an elevated PowerShell window run DeploySQLProvider.ps1.

NOTE: Important: this should not be run from the PowerShell ISE. It fails when run from the ISE, and you may end up with a partial deployment that requires cleanup.

NOTE: You can also specify a local location for the SQL Server 2014 SP1 Enterprise Evaluation ISO if you have it downloaded already. To do this, run the script with the -DependencyFilesLocalPath parameter. If it is not specified, the ISO will be downloaded during deployment. I prefer to let the script download it as part of the deployment.

When you run it, the script will walk you through several prompts:

The script will prompt you to input local admin account info. Note that the password you input here will also be used for the SQL SA account.

mastp2-sql-rp-2

The script will then prompt you for your Azure Active Directory tenant name. This is YOURDOMAIN.onmicrosoft.com.

You will then be prompted for an Azure Active Directory account. This should be the account you deployed Azure Stack TP2 with. This will be used to access Azure Stack and create stuff such as the resource provider, resource group and other resources needed by the RP.

mastp2-sql-rp-4

You need to enter a resource group name. You can leave the default if you want.

mastp2-sql-rp-5

You will then be prompted for the SQL server VM Name. Ignore the title of the pop-up here.

mastp2-sql-rp-6

The script will then run through all of its steps. Here is what the script does as detailed in the official documentation:

  • If necessary, download a compatible version of Azure PowerShell.
  • Create a wildcard certificate to secure communication between the resource provider and Azure Resource Manager.
  • Download an evaluation build of SQL Server 2014 SP1 from the internet or from a local file share.
  • Upload the certificate and all other artifacts to a storage account on your Azure Stack.
  • Publish gallery package so that you can deploy SQL database through the gallery.
  • Deploy a VM using the default Windows Server 2012 R2 image that comes with Azure Stack.
  • Register a local DNS record that maps to your resource provider VM.
  • Register your resource provider with the local Azure Resource Manager.
  • Connect the resource provider to the SQL server instance on the RP VM

As the script runs you will see it work through each of the steps with detail and status. Be patient; I have had this take anywhere from 30 to 45 minutes. It is a good time to go take a break.

mastp2-sql-rp-7

Once the script is done it will show that the installation is successful as shown in the following screenshot.

mastp2-sql-rp-8

NOTE: You could run the deployment script with the required parameters to avoid the prompts. For example:
DeploySQLProvider.ps1 -AadTenantDirectoryName “YOURDOMAIN.onmicrosoft.com” -AzCredential “user@YOURDOMAIN.onmicrosoft.com” -LocalCredential “username”
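If you go the parameter route you can build the credentials up front. Here is a quick sketch; my assumption is that -AzCredential and -LocalCredential take PSCredential objects, so check the script's parameter definitions before relying on it:

$aadCred   = Get-Credential -Message 'AAD account used to deploy Azure Stack TP2 (user@YOURDOMAIN.onmicrosoft.com)'
$localCred = Get-Credential -Message 'Local admin for the RP VM (this password is also used for the SQL SA account)'
.\DeploySQLProvider.ps1 -AadTenantDirectoryName 'YOURDOMAIN.onmicrosoft.com' -AzCredential $aadCred -LocalCredential $localCred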

If for some reason the RP deployment fails, you will need to review the logs to troubleshoot. Logs are written to LOCATIONOFYOURDOWNLOADEDRP\SQL PaaS RP\Logs with file names in the format DeploySQLProvider.ps1_20161205-171516.txt, as shown in the following screenshot.

mastp2-sql-rp-9
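A quick way to pull the newest log when troubleshooting (the path below is just an example; point it at your own extracted RP location):

# Grab the most recent DeploySQLProvider log and show the last 50 lines.
$logDir = 'C:\AzureStackSQLRP\SQL PaaS RP\Logs'   # example path - use your extract location
Get-ChildItem -Path $logDir -Filter 'DeploySQLProvider.ps1_*.txt' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 |
    Get-Content -Tail 50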

– RP CONFIGURATION –

After the SQL RP is deployed you can see it in the Azure Stack portal at https://portal.azurestack.local. The first thing you will want to look for is the resource group. Mine is named Microsoft-SQL-RP (the default is Microsoft-SQL-RP1). The RG is shown in the following screenshot.

mastp2-sql-rp-10

Next you will want to ensure the SQL Resource Provider is listed in Resource Providers as shown in the following screenshot.

mastp2-sql-rp-11

Go ahead and click on the resource provider to display its details. The first thing you will notice is that we don’t have any SQL Hosting Servers. This is normal and does not mean the script forgot to deploy the SQL server.

mastp2-sql-rp-12

Click on SQL Hosting Servers. Now click on + Add.
Input the same name that you put in when prompted for ...

Read More

Resource Group Clean-up in Azure Stack

If you are like me, you end up creating a ton of resource groups in Azure Stack when testing things out. I needed a way to delete them without having to click on each one via the portal. The best option of course is to leverage PowerShell, so I threw together some PowerShell to handle this. I came up with two options: #1 can be used to delete a bunch of RGs that have a common name. For example, I had a bunch of VM00* resource groups, so I used the script to loop through and delete all resource groups with VM0 in the name. Option #2 pops up a GUI window so I could select the RGs I wanted to delete. It puts them in an array and then loops through to delete them in one shot.

This is great because I can kick this off and go do something else. I will share both options below in this blog post along with some screenshots. I won’t have a download for the PowerShell syntax, so just copy from this post if you want to use it. Be sure to use AzureStack.Connect.psm1 to connect to your Azure Stack environment before running any of the following code.
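For reference, what AzureStack.Connect.psm1 does for you boils down to registering the Azure Stack admin endpoints as an AzureRM environment and logging in. Here is a rough manual equivalent; the endpoint URIs are what I use for TP2 and are assumptions on my part, so treat this as a sketch rather than a drop-in replacement for the module:

# Sketch: register the Azure Stack TP2 admin endpoints as an AzureRM environment and log in.
# The endpoint values below are assumptions/placeholders - adjust them for your deployment.
Add-AzureRmEnvironment -Name 'AzureStackAdmin' `
    -ActiveDirectoryEndpoint 'https://login.windows.net/' `
    -ActiveDirectoryServiceEndpointResourceId 'https://api.azurestack.local/' `
    -ResourceManagerEndpoint 'https://api.azurestack.local/' `
    -GalleryEndpoint 'https://gallery.azurestack.local/' `
    -GraphEndpoint 'https://graph.windows.net/'

Login-AzureRmAccount -EnvironmentName 'AzureStackAdmin'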

Code:
#1

#Create Variable of RGs with a common name
$Resourcegroups = Get-AzureRmResourceGroup | Where-Object {$_.ResourceGroupName -like '*VM0*'}

#Create array of RG names
$RGLIST = $Resourcegroups.ResourceGroupName

#Loop to remove each resource group in the array
foreach ($rg in $RGLIST)
{
    Get-AzureRmResourceGroup -Name $rg -ErrorAction SilentlyContinue | Remove-AzureRmResourceGroup -Force -Verbose
}

This image shows the array of RGs that will be looped through. I highlighted vm003rg in the array and in the PowerShell status message.

rgcleanup-1

The following screenshot shows VM003RG being deleted in the Azure Stack portal.

rgcleanup-2

#2

#Create Variable of RGs from GUI selection
$selectedrgs = (Get-AzureRmResourceGroup |
    Out-GridView -Title 'Select the resource groups you want to remove.' -PassThru).ResourceGroupName

#Loop to remove each resource group in the array
foreach ($rg in $selectedrgs)
{
    Get-AzureRmResourceGroup -Name $rg -ErrorAction SilentlyContinue | Remove-AzureRmResourceGroup -Force -Verbose
}

After running the Create Variable of RGs from GUI selection part of the code, a window will pop up as shown in the following screenshot. Select the RGs you want to remove, click OK, and they will be placed into an array.

rgcleanup-3

Below is the output of the array. Run the Loop to remove each resource group in the array part of the code and each of the RGs will be removed.

rgcleanup-4

I have also used this when a resource group would not delete from the portal. On some stubborn resource groups I have had to run this a couple of times. This is a short post. I hope this helps someone out!

Read More

OMS: Service Map overview

Recently the Operations Management Suite (OMS) team at Microsoft announced the private preview of Service Map in OMS, formerly known as Application Dependency Map. Service Map has been a long-awaited feature in OMS. It discovers and maps Windows and Linux app and system dependencies and displays them in application maps within OMS. Service Map did not start with OMS. It actually started as a standalone product named FactFinder and was later integrated with SCOM. The integration of FactFinder with SCOM allowed BlueStripe to automatically create Distributed Applications in SCOM. Well, Microsoft acquired BlueStripe and the rest is history.

In this post I will set out to explore and break down Service Map: how it is installed, info about the agent, how it works, key points about it, how the data flows, and more. NOTE: Click on any of the images in this post to display a larger version in a new window. Also, this post is my first effort at taking one of my PowerPoints and converting it into a post! The following graphic describes some of the benefits of having application maps included in your monitoring solutions, along with information about FactFinder:

oms-servicemap-overview-1

Now let’s take a look at what Service Map does and how it looks.

oms-servicemap-overview-2

Now let’s take a look at one of the Service Maps, aka application maps, in OMS. Notice the breakdown of the interface: in Service Map there is a focus machine in the center, with front-end and back-end connections into that focus machine. These are the dependencies flowing in and out of the focus machine, giving you the mappings. On the left-hand side you can adjust the time controls and select either a Windows or Linux machine from the list. Also on the left-hand side are the details of the current selection, which can be a machine or a process.

oms-servicemap-overview-3

Also notice that Service Map integrates with Change Tracking, Alerts, Performance, Security, and Updates. This means that when you have a focus machine selected, you can click on the corresponding solution on the right-hand side. When you click on a solution, i.e. Updates or Security, that solution’s dashboard widget is shown and you can drill down from there for further detail.

oms-servicemap-overview-4

oms-servicemap-overview-5

A common question that comes up when discussing Service Map is how it works. The following graphic displays the process from adding the solution to the actual mapping within OMS.

oms-servicemap-overview-6

Other key information about Service Map is detailed in the following graphics.

oms-servicemap-overview-7

The next graphic looks at deploying the SM agent and locations for logs. The process is as simple as downloading and installing the agent from OMS.

Here is some more critical information you need to know about the SM agent.

oms-servicemap-overview-9

This next graphic details how Service Map dependency data flows into OMS.

oms-servicemap-overview-10

The Service Map supported operating systems at this time are:

Windows:

  • Windows 10
  • Windows 8.1
  • Windows 8
  • Windows 7
  • Windows Server 2016
  • Windows Server 2012 R2
  • Windows Server 2012
  • Windows Server 2008 R2 SP1

Linux:

  • Oracle Enterprise Linux 5.8-5.11, 6.0-6.7, 7.0-7.1
  • Red Hat Enterprise Linux 5.8-5.11, 6.0-6.7, 7.0-7.2
  • CentOS Linux (CentOS Plus kernel is not supported)
  • SUSE Linux Enterprise Server 10SP4, 11-11SP4

Service Map’s computer and process inventory data is available for search in OMS Log Analytics. This is very cool, as the log analytics and searching capability in OMS is powerful and, most importantly, very FAST. Having application components, service dependencies, and supporting infrastructure configuration data at your fingertips through Log Analytics gives you a powerful troubleshooting and forensics tool. I am sure the query capabilities will be expanded to include even more over time.

oms-servicemap-overview-11 (Type=ServiceMapComputer_CL)
oms-servicemap-overview-12 (Type=ServiceMapProcess_CL)

A few Service Map Log Analytic query examples:

List the physical memory capacity of all managed computers:

Type=ServiceMapComputer_CL | select TotalPhysicalMemory_d, ComputerName_s | Dedup ComputerName_s

List computer name, DNS, IP, and OS version:

Type=ServiceMapComputer_CL | select ComputerName_s, OperatingSystemVersion_s, DnsNames_s, IPv4s_s | dedup ComputerName_s

List Process Map by process name:

Type=ServiceMapProcess_CL (ProductName_s=TeamViewer)

Thanks for reading and I hope you enjoyed this post on OMS Service Map. Now go out and add the public preview right away.

Read More

Azure Stack Deployment…No KVM…No Problem

When deploying Azure Stack (TP2) you may not have a KVM, a physical monitor, or maybe you just don’t want to use either with the host. Well there is a solution for this. You can utilize a Windows setup answer file for an unattended installation. What this will do is automate the Windows Setup for you. For Azure Stack you basically just need to input the administrator password. 🙂

Microsoft has put together an answer file and a PowerShell script that let you inject the answer file into CloudBuilder.vhdx before deploying Azure Stack. This enters the info on the Windows Setup screens for you so that you don’t need a KVM or physical monitor attached to the host. You can just wait for the host to reboot and then RDP in. The unattended answer file and script are part of the AzureStack-Tools. The AzureStack-Tools repository has some great resources in it and I will be blogging about more of them in the future.

There are basically 2 steps to inject this answer file into your Azure Stack VHDX. These are:

Step 1:

Go and download the Deployment tools files manually onto your Azure Stack host from here:

https://github.com/Azure/AzureStack-Tools/tree/master/Deployment

Or run this PowerShell from your Azure Stack host:

# Variables
$Uri = 'https://raw.githubusercontent.com/Azure/AzureStack-Tools/master/Deployment/'
$LocalPath = 'YOURLOCATION:\AzureStack_TP2_SupportFiles'

# Create folder
New-Item $LocalPath -Type directory

# Download files
'BootMenuNoKVM.ps1', 'PrepareBootFromVHD.ps1', 'Unattend.xml', 'unattend_NoKVM.xml' | ForEach-Object { Invoke-WebRequest ($Uri + $_) -OutFile ($LocalPath + '\' + $_) }

Be sure to set $LocalPath to your location.

Step 2:

NOTE: You need to have the CloudBuilder.vhdx downloaded to your Azure Stack host and it cannot be mounted.

From within PowerShell, navigate to the directory you downloaded the deployment tools to and run this:

.\PrepareBootFromVHD.ps1 -CloudBuilderDiskPath YOURDRIVE:\CloudBuilder.vhdx -ApplyUnattend

Be sure to point the script to the location containing your CloudBuilder.vhdx before running this.

You will be prompted to enter the password you want to use for the local administrator account.

applyasunattended1

You will see the bcdedit command execution and output as shown in the following screenshot. This saves you the step of modifying the BCD with bcdedit yourself. The CloudBuilder.vhdx will also be mounted. You will then be asked to confirm a reboot, also shown in the following screenshot.

applyasunattended2

Before you reboot, if you are interested, you can take a look at the unattend.xml file that was created. This is the answer file that will be used, and it is shown in the following screenshot.

applyasunattended3

The host will be rebooted. When it comes back online you will be able to RDP in. You will then be able to kick off the Azure Stack deployment.

Happy Azure Stacking!!!

Read More

Azure Stack TP2 deployment failure 60.120.123

I recently deployed the new Azure Stack TP2 release. This install is way better. I did run into one small issue during the deployment. Below is what I ran into and the solution.

Failure in Deployment log:

2016-11-18 02:18:36 Error    1> Action: Invocation of step 60.120 failed. Stopping invocation of action plan.

Finding the root of the failure:

When walking back the step index in the summary XML log, the error landed on step 60.120.123.

-<Task EndTimeUtc="2016-11-18T08:15:23.1042963Z" Status="Error" StartTimeUtc="2016-11-18T08:10:40.5896841Z" ActionType="Deployment-Phase4-ConfigureWAS" RolePath="Cloud">

-<Action EndTimeUtc="2016-11-18T08:15:23.1042963Z" Status="Error" StartTimeUtc="2016-11-18T08:10:40.5896841Z" Type="Deployment-Phase4-ConfigureWAS" Scope="Internal">

-<Steps>

-<Step EndTimeUtc="2016-11-18T08:15:23.1042963Z" Status="Error" StartTimeUtc="2016-11-18T08:10:40.5896841Z" Name="(Katal) Configure WAS VMs" Description="Configures Windows Azure Stack on the guest VMs." Index="123">

-<Task EndTimeUtc="2016-11-18T08:15:23.1042963Z" Status="Error" StartTimeUtc="2016-11-18T08:10:40.5896841Z" RolePath="Cloud\Fabric\WAS" InterfaceType="Configure">

-<Exception>

<Message>Function 'ConfigureWAS' in module 'Roles\WAS\WAS.psd1' raised an exception: Time out has expired and the operation has not been completed. at Stop-WebServices, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 699 at Restart-WebServices, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 712 at Invoke-Main, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 649 at <ScriptBlock>, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 738 at <ScriptBlock>, <No file>: line 21</Message>

<StackTrace> at CloudEngine.Actions.PowerShellHost.Invoke(InterfaceParameters parameters, Object legacyConfigurationObject, CancellationToken token) at CloudEngine.Actions.InterfaceTask.Invoke(Configuration roleConfiguration, Object legacyConfigurationObject, MultiLevelIndexRange indexRange, CancellationToken token, Dictionary`2 runtimeParameter)</StackTrace>

<Raw>CloudEngine.Actions.InterfaceInvocationFailedException: Function 'ConfigureWAS' in module 'Roles\WAS\WAS.psd1' raised an exception: Time out has expired and the operation has not been completed. at Stop-WebServices, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 699 at Restart-WebServices, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 712 at Invoke-Main, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 649 at <ScriptBlock>, D:\WAP\Setup\Scripts\Configure-AzureStackMasd.ps1: line 738 at <ScriptBlock>, <No file>: line 21 at CloudEngine.Actions.PowerShellHost.Invoke(InterfaceParameters parameters, Object legacyConfigurationObject, CancellationToken token) at CloudEngine.Actions.InterfaceTask.Invoke(Configuration roleConfiguration, Object legacyConfigurationObject, MultiLevelIndexRange indexRange, CancellationToken token, Dictionary`2 runtimeParameter)</Raw>

</Exception>

</Task>

</Step>
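If you would rather not eyeball the XML, something like this pulls the failing steps out of the summary log (the path is an assumption; point it at wherever your deployment wrote its summary XML):

# Sketch: list the failing steps from the deployment summary XML.
[xml]$summary = Get-Content 'C:\CloudDeployment\Logs\summary.xml'   # path is an example
$summary.SelectNodes('//Step[@Status="Error"]') | ForEach-Object {
    '{0} (Index {1})' -f $_.GetAttribute('Name'), $_.GetAttribute('Index')
}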
Solution:

The first option is to re-run the deployment from the specific failed step. Do this by using the following syntax:

Import-Module C:\CloudDeployment\CloudDeployment.psd1 -Force

Import-Module C:\CloudDeployment\ECEngine\EnterpriseCloudEngine.psd1 -Force

Invoke-EceAction -RolePath Cloud -ActionType Deployment -Start 60.120.123 -Verbose

The second option for this specific issue is to re-run the deployment with the network parameters included. Use the following syntax:

.\InstallAzureStackPOC.ps1 -AdminPassword $adminpass -AADAdminCredential $aadcred -AADDirectoryTenantName X.onmicrosoft.com -NatIPv4Subnet 192.168.5.0/24 -NatIPv4Address 192.168.5.3 -NatIPv4DefaultGateway 192.168.5.1 -EnvironmentDNS 192.168.5.1 -Verbose

Read More

Fun @ the MVP Summit 2016

This year at the MVP Summit was a great one.

I learned a lot of stuff mostly about OMS, System Center, and Azure Stack.

I cannot talk about any of it. 🙂

I can however talk about some of the fun times we had and share some pictures.

 

First picture… a warm welcome to MVPs from around the world.

image001

Here is a picture of the US MVPs at the summit!

all-us-mvps

Me at the Microsoft Enterprise Engineering Center in Redmond.

image003

image005

A room full of talented MVPs! Check out the cool US MVP jerseys.

image007

The annual Concurrency MVP dinner. We have 12 MVPs at Concurrency now!

image009

image011

Me with MVP and SCOM guru Scott Moss.

image013

Azure Stack power in the house, or should I say data center: MVPs Mark Scholman, Florent Appointaire, and me.

image015

Picked up some really cool Azure Stack stickers.

image017

Having fun with MVP and SCOM master Tao!

image021

With MVPs Kurt Van Hoecke, Jakob G. Svendsen, and Tao.

image023

With long time MVP and SCOM godfather Cameron Fuller.

image025

With MVP Annur Sumar.

image027

MVP and Service Manager master Andreas Baumgarten.

image029

Solving the world’s problems with David ...

Read More

Download all Azure Stack Ignite 2016 decks

Mattias Fors has a post with a PowerShell script that can be used to download all of the Microsoft Ignite 2016 slide decks. The blog post is here: https://deploywindows.info/2016/09/30/download-ignite-2016-slidedecks.

You can use this script to download all the Azure Stack session decks.

2016-10-04-14_59_34-windows-powershell-ise

Just change if ($item.title -notlike "Re:*") to if ($item.title -like "*Azure Stack*") in the script and run it. It will place the decks in C:\Ignite2016Slidedecks.

2016-10-04-15_01_47-c__ignite2016slidedecks

Enjoy!

Read More

Breakout of the Cold & into MN tech

Minnesota has been a hotbed of tech for a long time. In 2015 Minnesota was named one of The Fastest-Growing States For Tech Jobs In 2015 by Forbes. Link here:

http://www.forbes.com/sites/susanadams/2015/08/18/the-fastest-growing-states-for-tech-jobs-in-2015

Also in 2015 Minnesota landed #1 on Dice.com’s Fastest-Growing States for Tech Jobs. Link here:

http://media.dice.com/report/august-2015-fastest-growing-states-for-tech-jobs

Most recently, within the past few months, Amazon has opened an office in downtown Minneapolis with 100 full-time tech positions, giving the local tech community another boost on the national scale. Here is an article that covers companies such as Amazon expanding into this market to tap into the rich technical talent pool. Article: High-tech talent grab. Link: http://www.bizjournals.com/twincities/news/2016/09/30/high-tech-talent-grab.html

Back in the 1960s, Control Data Corporation, one of the nine original major computer companies in the US, was headquartered in Minnesota, and later in the 1970s Cray supercomputers also came out of Minnesota. In recent years the scene has been heating up even more and the word has been getting out. Minnesota’s tech scene is stronger than ever, with a number of high-profile startups as well as many well-established tech organizations. Some of the hot tech startups include:

Code42

JAMF Software

The Nerdery

Leadpages

Upsie

Vidku

Field Nation

And other notable tech organizations are:

SPS Commerce

Stratasys

Optum

Lawson Software

Compellent (Acquired by Dell some years back)

Digital River

The list of startups and other large tech organizations could fill up an entire blog post itself so I had to limit the list.

I am from Minnesota and proud to be a part of this thriving tech community. Recently a documentary about the tech scene in Minnesota was released named DocuMNtary.

documntary

This film was produced by a techie named Nick Roseth, with music by the MN super hip hop collective Doomtree and narration by legendary MN hip hop emcee Dessa, also of Doomtree. I was impressed that Nick pulled in Doomtree and Dessa to help with the film. A great move in my opinion; they bring an artistic and authentic MN feel to the film’s creative side.

They kick off the film by getting the obvious out of the way: Minnesota’s cold weather. Once we break out from the cold, it is time to focus on the culture and tech. Next they examine what makes Minnesota great, why people stay once they come to MN, and the great things about the culture. They also touch on why MN tends to have a collaborative culture. The film then moves into the history of tech in MN and establishes the roots.

They continue through the film showcasing several startups, how the tech ecosystem is supportive, tech training, associations and government agencies that help facilitate tech in MN, events such as MN Cup and Startup Week, and more. The film calls out a Minnesota-focused website named Tech{dot}MN (http://tech.mn). Tech{dot}MN is the go-to for all things MN tech, such as events, user groups, startup and other tech news. They even address the issue of diversity in tech and what is happening in MN to help bridge this gap.

In the film they interviewed 50 players in the tech scene from a variety of companies and organizations. Here is a screenshot of all the featured people.

50people

Towards the end of the film it is admitted that Minnesotans are not the best at telling our story, due to our Midwestern and humble nature. There is a call to action for viewers who are in tech and from Minnesota to do some bragging and get the word out about the magic happening here. That prompted me to write this blog post! I hope you enjoyed this breakdown of the DocuMNtary film. The film website is: http://www.documntary.com

You can watch it here:

I also want to call out some things happening in Minnesota tech that were not covered in the film. We have a thriving community in the Microsoft space. In fact, Minnesota is one of 20 locations in North America where Microsoft has chosen to place one of its Microsoft Technology Centers (MTC). You can learn more about the MTC here: https://www.microsoft.com/en-us/mtc/locations/minneapolis.aspx

Minnesota is home to 24 Microsoft MVPs, including myself. Here are some of the names of our local Microsoft MVPs.

Brian Mason

Nash Pherson

Tim Curwick

Ryan Ephgrave

Tim Star

Paul Timmerman

ASP.NET MVP: Robert Boedigheimer – Blog: http://weblogs.asp.net/boedie

Data Platform MVP: Dan English

Will Smith

Scott Hamilton

Wes Preston

Cloud and Data Center Management MVP: Greg Shulz – Blog: http://storageioblog.com

Each of these Microsoft MVPs is highly talented in their respective area of technological expertise, and Microsoft has recognized them for it. I am proud to have such a high concentration of MVPs in Minnesota. Learn more about the MVP program here: https://mvp.microsoft.com/en-us/overview

Minnesota can boast some of the highest turnouts for our user groups and events.

SharePoint Saturday Twin Cities is the biggest one in the US with attendance often reaching 800+. More about this event here: www.spstc.com

MN SQL Saturday is an annual event that has been around for some time. This event typically attracts 450+ attendees, with many MVPs coming out to present. http://www.sqlsaturday.com/557/EventHome.aspx

Midwest Management Summit (MMS) has been around for 4 years. It is held at the Radisson Blu Mall of America and has sold out every year. Experts and attendees come from all over the US and the world to be a part of this magical event around Microsoft management technologies on premises and in the cloud. MMS has experts come in from Ireland, Denmark, Sweden, the United Kingdom, Canada, and more. Here is an old blog post from Microsoft on MMS: https://blogs.msdn.microsoft.com/mvpawardprogram/2015/11/16/mvps-from-around-the-world-come-together-again-for-mms-2015-in-minnesota and the official website is here: https://mmsmoa.com.

Some of the MMS folks are also involved in the MN System Center User Group (MNSCUG) https://mnscug.org and the Minnesota Azure User Group http://www.mnazureusergroup.com. It is not uncommon for MNSCUG to host full-day events with 100+ attendees and for the MN Azure UG to get 50+ attendees.

DevOps Days Minneapolis is another event held in Minnesota that attracts a large crowd and speakers from all over! More about this event here: https://www.devopsdays.org/events/2016-minneapolis

On top of that, Minnesota has SQL PASS, IoT UGs, the Twin Cities Mac Admins Meetup, Amazon AWS UGs, and more. A full list of user groups and events can be found on Tech{dot}MN.

Beyond just the tech community ecosystem, if you want to launch a startup, work in corporate tech, work for a partner of one of the big tech companies (Microsoft, Google, Amazon, IBM), work in open source, work as a developer, or even freelance, Minnesota has a place for you.

So to wrap up this post: the next time you think of MN, go beyond the perception of the cold, our numerous sports teams, and the 10k lakes, and remember that this is a tech hotbed and it’s only getting hotter all the time!

Read More