Hey. Today I want to talk about an interesting case that involves Kerberos, Always On VPN and access to CIFS.
A customer had recently deployed Always On VPN in their infrastructure. Most clients worked well with it, but a few machines with old VPN clients installed had mixed issues.
Cisco AnyConnect usually worked fine when installed, but there was another VPN client that disabled the IKEEXT service, which prevented the Always On VPN IPsec tunnel from working properly. Uninstalling that software solved those issues. The customer still had it installed on some clients as a backup solution for when IPsec was blocked at the source (hotels, airports and so on).
But hey! I mentioned Kerberos, how does that come into play?
Disclaimer: This post might have little to do with Always On VPN, but the issue manifested itself when connected through AOVPN.
Well, there were a few clients that actually connected fine. They could ping stuff on the network and everything seemed fine until they tried to access the file server. They got a credential prompt stating that they had no access to the domain controller, even though I could get LDAP access with the AD PowerShell module, so LDAP was obviously working. This was interesting. After a few log checks on both sides of the fence, nothing popped out, so I decided to install Wireshark on the domain controller to try and figure things out. This gave me lots of new and critical information.
I could clearly see that Kerberos was not working: the KDC returned an "unsupported" error when the client tried to get a Kerberos ticket. So I checked the DC logs and found issues with the Kerberos certificate.
Sorry for the lack of screenshots, this all happened really fast and I was definitely not allowed to screenshot the customer's data.
It turned out that the domain controller was using previously issued certificates from an old and retired Certificate Authority. So I deleted all of them and issued the domain controller new certs for domain controller authentication and Kerberos authentication. Now my senses were tingling, since I knew this would fix the problems. And lo and behold, it did! The troubled clients worked right away.
But one thing was, and still is, bothering me. Since this was a server-side fix, why didn't all the clients have this issue? Why were only a select few clients using Kerberos auth? The customer tells me that all computers are equal, installed from the same image and getting the same policies. So why were only a select few using Kerberos (and failing)? At the time of writing, I don't know; this happened just recently. Maybe you have some ideas? Feel free to contact me on Twitter (@mrblackswe) or post a comment below. Something tells me that the clients are not equal at all, despite what the customer is telling me (usually the case). The clients are Windows 10 Pro 1809, as far as I know.
Good day to you. Today I've done a little write-up about my home lab equipment. I was noticing a few slowdowns once I got around 10 VMs running on my old "server", which was an older-gen Intel E3 1240 series CPU with 32 GB of DDR3 RAM, SSD cache and spinning HDDs for mass storage. Since the hardware was closing in on 5 years of service, I thought it was time to invest in something new.
This time I decided to go AMD, specifically the Threadripper 1920X with 12 cores/24 threads and 64 GB of DDR4 memory. So I doubled the RAM amount as well as the RAM speed, and the core count tripled, with higher clocks as well. Going all-flash did its thing for sure: there are now five SSDs, a mix of SATA and M.2 drives, in RAID-0 hosting the VMs through Storage Spaces. As far as I know, the only limitation of running Hyper-V on Threadripper is that it can't do nested virtualization, but I haven't verified that myself yet, as it's a feature I don't specifically need.
I did not invest in networking at all, since I don't really need more than 1 Gbit/s externally from the host. Everything else in my network runs a single NIC except for the NAS, which cannot get to line speed anyway despite having 4 ports. I could always get a 10 Gbit add-on card later if needed.
So, once the new workhorse was built and Hyper-V installed, it was only a matter of setting up constrained Kerberos delegation correctly and starting to migrate machines. Live migration was out the window due to CPU differences, so I had to make the VMs migration-compatible. This is done with PowerShell, for example:
Set-VMProcessor -VMName NameOfVM -CompatibilityForMigrationEnabled 1
Note that the machine has to be turned off for this to run correctly. Once I'd run the Move-VM command, I just ran the same command again with -CompatibilityForMigrationEnabled 0 and the move was complete.
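Putting the whole move together, a rough sketch of the workflow could look like this. The VM name, host name and storage path are examples, not from my actual environment:

```powershell
# Sketch of the migration workflow between hosts with different CPUs
$vm = "NameOfVM"

# The VM must be turned off before changing processor compatibility
Stop-VM -Name $vm

# Enable processor compatibility so the VM can start on a different CPU
Set-VMProcessor -VMName $vm -CompatibilityForMigrationEnabled $true

# Move the VM and its storage to the new host
Move-VM -Name $vm -DestinationHost "NewHyperVHost" -IncludeStorage `
    -DestinationStoragePath "D:\Hyper-V\$vm"

# Once the move is complete, turn compatibility off again (run against the new host)
Set-VMProcessor -VMName $vm -CompatibilityForMigrationEnabled $false
```

Leaving compatibility mode on costs you some CPU features inside the guest, which is why I turn it off again once the move is done.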
The new machine feels much faster than the old one, and a fresh install of WS2019 with desktop experience from MDT took just 4 minutes to finish. I may do some IOPS testing further down the road, but I expect the numbers to be pretty good for consumer/workstation-grade hardware.
A couple of weeks ago now, I was focused on preparing for, and speaking at, Techdays in Helsinki, Finland. I was really happy to be accepted for the conference after Alexander spoke there last year and praised the arrangement. I was also very happy that Techdays chose to accept my session on Windows Virtual Desktop, since this is one of the topics I'm most passionate about and involved in currently.
I have presented this session previously, at Igel Disrupt, but this time I had a different kind of audience, with more mixed backgrounds and more of a focus on "regular" client management. In the end, it turned out great!
I felt that I had a very good interaction with the audience and I’ve received a number of questions during and after the event. Also, the feedback has been amazing and I’m very glad and humbled by that.
So, why do I think that WVD is such a big deal? Well, I've said it before: to me, the first and most obvious benefit is that this will democratize the so-called EUC (End User Computing) landscape. The technologies out there today are usually pricey and fairly complicated to configure and maintain (and yes, that includes Windows Server RDS). They usually also require you to buy a number of licenses up front, or at least do the implementation as a project.
This has prevented some organizations, especially smaller ones, from going down this route, even though they would like to. WVD changes that. You can scale DOWN to one user on one VM if you like, and that's fine. You don't have any upfront cost; you pay for your consumption (even though it can actually be cheaper to buy a reserved instance and pay for it upfront). In its simplest configuration, it's a very easy solution to implement and manage.
You of course get all the benefits that any, or most, EUC solutions have today in terms of connectivity, security and mobility.
One of the feedback points I received, both in Munich and in Helsinki, was that I almost sound overly positive and don't present the downsides of the service. For this, I'm sorry. It's actually not intentional, and therefore I would like to point out a few downsides I currently see with the service (based on publicly available facts):
- It's great to run apps and desktops in the cloud, but you need to consider your apps first. This will be the showstopper for many organizations. If you have systems that require connectivity to your local datacenter, for example, it's perhaps not great from a performance perspective to put the client in the cloud. You can of course see this as an opportunity as well – you are moving your stuff to the cloud – but consider it first.
- Second, authentication. Personally, I feel that the current solution could be highly improved, but that would require more cross-product-group work. The RDS team can't sort this out by themselves; they need help from the Windows, AD and Azure AD teams, among others. I'll dig deeper into this at the time of the public preview.
- Since this is a kind of hybrid if we compare it to other solutions, we need tools that make it easier to manage the service, especially the VMs. You don't need to manage and maintain the actual underlying infrastructure – but you need to configure it, secure parts of it and manage your VMs. This will also require some cross-PG work, and this (as well as security) is where I see that I personally can make a difference.
There are of course other downsides as well – and I'm really looking forward to getting more information about the final decision on licensing of the service. We'll see.
This is, however, feedback I'm struggling with. I do get it, I do see it as important and I do want to be better at not just looking at the good sides, but also (in blogs or when I'm speaking) giving my audience a realistic picture. Again, I'm not trying to hide anything, it's just a matter of me focusing on the amazing technology.
I've actually had this challenge before. In the early days of Windows 10, I did a customer presentation on Windows 10 and why it would be the best OS for that customer. They found the presentation interesting and they saw the benefits, but then they asked me a question: "So, what's bad with Windows 10? There needs to be something, or else we won't be able to trust what you are saying." I do get that feedback, especially now, a few years later. So, moving forward, I'll do my best to present a more nuanced picture of whatever I'm presenting on.
So, we'll for sure have reasons to get back to WVD in coming blog posts, but for now I'll be focusing a lot on my "core" technologies, which are especially Windows 10 and EMS.
Hey, the Finn here again with another random post.
A client reached out to me since they are in the process of switching print servers. They have a lot of printers, to say the least. They are all managed manually, and they would like to avoid the hassle of creating all the print queues manually on the new server. PowerShell to the rescue!
They gave me a list of things they wanted the script to do. I might have over-delivered on that, but I like PowerShell. It's freaking awesome for so many things in Windows and beyond.
So, let's have a look at what I did. The script is full of comments for explanation, of course, and split into regions for easy reading.
Since we will be dealing with installing printer ports, printers and DNS records, I figured it might be a good idea to check that the script is "run as admin", or just end it right there.
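A minimal version of that elevation check could look like this:

```powershell
# End the script right away if it is not running elevated
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Error "This script must be run as administrator."
    exit 1
}
```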
Once that is out of the way we need to establish some customer specific variables such as the input file (CSV), the DNS server name, the zone in which to create the record and lastly the template printer that we want to use for custom settings.
With the initial variables defined and checked by the script, we can move on to creating the actual functions that do stuff. First we create the DNS record.
This requires the DnsServer RSAT module to be installed on the machine running the script – the script actually checks for that as well and will refuse to run if it's missing. Installing it is really easy on Windows 10; just run this PowerShell command: Add-WindowsCapability -Online -Name Rsat.Dns.Tools~~~~0.0.1.0 and you are all set.
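A sketch of what such a function might look like. The function and parameter names here are hypothetical, not from the actual script:

```powershell
# Hypothetical sketch of the DNS function; names and parameters are examples
function New-PrinterDnsRecord {
    param ($PrinterName, $IPAddress, $DnsServer, $ZoneName)
    try {
        # Create an A record for the printer in the given zone
        Add-DnsServerResourceRecordA -Name $PrinterName -ZoneName $ZoneName `
            -IPv4Address $IPAddress -ComputerName $DnsServer -ErrorAction Stop
        Write-Output "Created A record for $PrinterName"
    }
    catch {
        Write-Warning "Failed to create DNS record for ${PrinterName}: $_"
    }
}
```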
For the next portion of the script, we'll create the network port on which the printer resides, using some input data from the CSV. We'll take a try/catch approach, and we'll try to create the printer itself right after, logging to screen and file if something goes wrong.
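Roughly like this, again with hypothetical names – the column names in your CSV will of course be whatever you defined:

```powershell
# Sketch of port and printer creation; names are examples from the CSV row
function New-PrinterQueue {
    param ($Name, $IPAddress, $DriverName, $LogFile)
    try {
        # Name the port after the printer's IP address
        Add-PrinterPort -Name $IPAddress -PrinterHostAddress $IPAddress -ErrorAction Stop
        # Create the printer queue itself on the new port
        Add-Printer -Name $Name -DriverName $DriverName -PortName $IPAddress -ErrorAction Stop
    }
    catch {
        # Log to screen and file if something goes wrong
        Write-Warning "Failed to create ${Name}: $_"
        "$(Get-Date) - Failed to create $Name - $_" | Out-File $LogFile -Append
    }
}
```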
Now we have the printer created and ready to go. Next we will set the permissions on it from the template printer, along with any other settings, which come as XML from the originating printer.
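A hedged sketch of that step, assuming the permissions travel as an SDDL string and the device settings as a print ticket in XML form:

```powershell
# Sketch: copy security descriptor and print ticket from a template printer
function Copy-PrinterSettings {
    param ($TemplatePrinter, $TargetPrinter)

    # -Full is needed for Get-Printer to return the permission SDDL
    $template = Get-Printer -Name $TemplatePrinter -Full
    Set-Printer -Name $TargetPrinter -PermissionSDDL $template.PermissionSDDL

    # The device settings are exposed as a print ticket in XML form
    $config = Get-PrintConfiguration -PrinterName $TemplatePrinter
    Set-PrintConfiguration -PrinterName $TargetPrinter -PrintTicketXml $config.PrintTicketXML
}
```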
That is the last of the functions in this exercise. Now, onto the trigger of the whole thing.
We do some basic checks before kicking things off: making sure that there is a template printer, that we have the necessary PowerShell modules and that the input CSV actually exists, before doing anything else.
Lastly, we store the variables as arguments to send to the different functions and voilà, the script is done.
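The trigger can be sketched roughly like this, using hypothetical names for the functions and variables described earlier:

```powershell
# Rough sketch of the trigger; function, variable and CSV column names are hypothetical
if (-not (Get-Printer -Name $TemplatePrinter -ErrorAction SilentlyContinue)) {
    Write-Error "Template printer not found."; exit 1
}
if (-not (Get-Module -ListAvailable DnsServer)) {
    Write-Error "DnsServer RSAT module is missing."; exit 1
}
if (-not (Test-Path $InputFile)) {
    Write-Error "Input CSV not found."; exit 1
}

# Loop over the CSV and call the functions for each printer
foreach ($row in Import-Csv $InputFile) {
    New-PrinterDnsRecord -PrinterName $row.Name -IPAddress $row.IP `
        -DnsServer $DnsServer -ZoneName $ZoneName
    New-PrinterQueue -Name $row.Name -IPAddress $row.IP `
        -DriverName $row.Driver -LogFile $LogFile
    Copy-PrinterSettings -TemplatePrinter $TemplatePrinter -TargetPrinter $row.Name
}
```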
Feel free to comment all of my mistakes. Download file here.
Hey, Toni here again. Some customers are still stuck on using Azure classic for their deployments. Microsoft has done a lot of work over many years to convince everyone to move over to the Azure Resource Manager (ARM) model, with good success as well. Did you know that they offer good tools (well, scripts actually) for the migration itself?
Microsoft has also primarily been developing all the cool new features for ARM, only maintaining the classic (ASM) model. A few years ago, not everything was available in the ARM management model, but today most of your services will run fine (better, even) on ARM.
The nice thing about migrating is that Microsoft actually provides you with tools to verify the success of the migration in advance, so you'll know what you have to fix (or break, in some cases) to complete the migration successfully. The migration will check for problems; once those are fixed, it will go to a "prepare" phase. Once there, you have the option to commit the migration or abort, if you wish. So it's a very flexible operation.
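If you drive this from PowerShell, the validate/prepare/commit flow for a classic virtual network looks roughly like this. This assumes the classic (ASM) Azure PowerShell module, and the network name is an example:

```powershell
# Requires the classic (ASM) Azure PowerShell module; network name is an example
Add-AzureAccount

# Validate first - this lists anything you need to fix before migrating
Move-AzureVirtualNetwork -Validate -VirtualNetworkName "MyClassicVNet"

# Prepare the migration - resources show up in ARM, but nothing is final yet
Move-AzureVirtualNetwork -Prepare -VirtualNetworkName "MyClassicVNet"

# Commit to finish - or run with -Abort instead to roll back to classic
Move-AzureVirtualNetwork -Commit -VirtualNetworkName "MyClassicVNet"
```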
One thing to note: if you are running VM backups, those will need to be disabled before migration (and the extension removed from the VM). Once you are on the ARM model, you can re-activate your recovery plans to create new backups; the old ones become obsolete. Also, the prerequisite check will complain about the BGInfo extension, but that will be removed automatically anyway, so there is no real reason to remove it beforehand.
The only issue I've had with the migration thus far was a subscription where the customer was using Azure Traffic Manager. That broke for some reason, which meant I had to, quite quickly after the migration, create new endpoints for the Traffic Manager. Load balancers were migrated just fine.
Once you have migrated over to ARM, you will be offered a plethora of features that you previously did not have in the portal: new services, functions, RBAC access and many, many more.
So please, consider migrating to Azure Resource Manager. It will make your life easier… and who wants to use tech from 10 years ago anyway?
Hey there. The Finn here with another little quick update. This time on Windows Admin Center and certificates.
You are using Windows Admin Center, right? It’s good stuff. You like good stuff, right?
Self-signed is the default option, but man is that an ugly way to go.
Here is how you update your Windows Admin Center certificate without any third party tool.
Open up PowerShell and run dir cert:\localmachine\my to get a list of installed certificates.
Copy the thumbprint for the certificate you want to use.
Then we check for the application ID that WAC uses and the port it is bound to (the default is 6515) with netsh http show sslcert.
Once we have the new thumbprint and the appid, we can go ahead and delete the existing certificate binding, again with netsh: netsh http delete sslcert ipport=0.0.0.0:6515 (or the port you are using).
And finally we can bind the new certificate to the WAC port.
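Put together, the delete and re-bind steps look roughly like this. The thumbprint and appid are placeholders for the values you noted from the earlier netsh output, and the service name is what WAC used at the time of writing:

```powershell
# Placeholders - use the thumbprint and appid you noted earlier
$port  = 6515
$hash  = "PASTE_THUMBPRINT_HERE"
$appid = "{PASTE_APPID_HERE}"

# Remove the existing (self-signed) binding
netsh http delete sslcert ipport=0.0.0.0:$port

# Bind the new certificate to the WAC port
netsh http add sslcert ipport=0.0.0.0:$port certhash=$hash appid=$appid

# Restart Windows Admin Center to pick up the new binding
Restart-Service ServerManagementGateway
```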
And that is all there is to it. However… I saw a retweet from my MVP friend Andy Syrewicze, who linked to a colleague of his who wrote a handy little tool that does this switch-a-roo as well.
You can find that tool here: https://etechgoodness.wordpress.com/2019/02/28/announcing-windows-admin-center-certificate-selector/
Hi, my name is Toni, aka "The Finn" from some guest appearances on the show. I was invited to write some stuff here, so here goes.
I have recently been doing some Azure work and want to share some of that with you.
Building your applications and infrastructure in multiple Azure virtual networks, or even regions, is common practice nowadays. But some of you might not know that you can very easily stitch those different virtual networks together using Azure's own backbone, instead of the elaborate VPN solutions that were the norm many years ago.
Let's say you have a production network and a developer network that you want to talk to each other. Pretty much all you need to do is go to each network and create the peerings. Very easy and straightforward. Within an Azure site, you can even allow a peered network to utilize your gateway, giving it access to your on-premises networks, with a checkbox.
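The same thing can be done from PowerShell. This is a minimal sketch assuming an authenticated Az session; the network and resource group names are examples:

```powershell
# Assumes an authenticated Az session (Connect-AzAccount); names are examples
$prod = Get-AzVirtualNetwork -Name "vnet-prod" -ResourceGroupName "rg-prod"
$dev  = Get-AzVirtualNetwork -Name "vnet-dev"  -ResourceGroupName "rg-dev"

# A peering has to be created in both directions to come up as "Connected"
Add-AzVirtualNetworkPeering -Name "prod-to-dev" -VirtualNetwork $prod `
    -RemoteVirtualNetworkId $dev.Id
Add-AzVirtualNetworkPeering -Name "dev-to-prod" -VirtualNetwork $dev `
    -RemoteVirtualNetworkId $prod.Id
```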
Once your peerings are connected, you will need to make sure that your network security groups allow traffic between the networks – and don't forget the local firewall on your servers.
That’s it! Enjoy connecting your Azure regions and networks together without complex gateways or VPNs.
This post is written by request from parts of the Power BI community. I've really enjoyed writing it – not because it's a deep dive, because it isn't. I've enjoyed writing it because this post shows what you can do with Intune and Windows 10, which will help another strong community grow and make use of Microsoft 365.
So, this is the request:
How do we distribute Power BI to our users, how do we keep the clients up to date and should we choose the Store version of Power BI Desktop or the installed version?
There are a few prerequisites to what I'll be showing below:
I assume that the management and enrollment of these devices (Windows 10, iOS and Android) have been set up and are working, and that the devices have already been enrolled (if needed).
I expect that the Power BI and Intune licenses have been assigned to the users.
The Windows devices need to be either Azure AD Joined or Hybrid Azure AD Joined.
The Windows version needs to be Enterprise, Education, Business or Pro, version 1607 or later.
Let’s get started!
Which version should you use – the Store app or the installed application?
Well, it depends on how you would like to manage it. First, the feature set in these apps is the same. You can do everything you expect to be able to do in both the app and the installed version of Power BI Desktop.
From a Windows, and a security, point of view, the app (from the Microsoft Store) would be your primary choice. The reasons being that the app has been built in such a way that it handles updates and upgrades of Windows just fine. It's a highly secure architecture, and it integrates very well with most of Microsoft's (and others') management and security solutions.
To deploy the app, which I'll show you how to do in a little while, you publish it from the Microsoft Store. That means that you don't have to repackage it, and that it will keep itself up to date after you've installed it. You'll also ensure that you are always deploying the latest release of Power BI.
The downsides, on the other hand, are mainly two:
- The bad thing about an app that updates itself is that you don't have control over the updates. You can't control when a specific update is released and installed on your machines.
- When you are using the app, it will automatically adapt its language to the language of the Windows 10 operating system it's installed on. So, say that you have a Swedish Windows 10 Enterprise, then the app will be in Swedish as well – which in some cases isn't at all what we want.
With the installed application, on the other hand, which we get as an .MSI file, you have to take care of upgrades yourself. You must repackage it – a simple process, but still something you need to do. This is also a fairly old, and not always streamlined, way of installing applications. It's a larger (even though still small) security concern than the app, and it demands more work from you as an administrator.
The flipside, on the other hand, is control. You control when the Power BI update gets installed on machines. It's much easier to pilot a new version with a select group of users, and you can release at the pace you'd like.
You are also in control of which language Power BI will be installed in, and it doesn’t need to match the language of the operating system.
So, in most cases you are choosing between simplicity and control. Next, I'll show you how to set up the distribution of both of these using Microsoft Intune and Microsoft Store for Business.
Distributing and installing Power BI using Microsoft Store for Business & Microsoft Intune
We’ll start in the Intune console, which you can find in the Azure portal (portal.azure.com). Search for Intune and when you’ve found it, navigate into the “Apps” section.
To the left you'll see an option for Microsoft Store for Business, and if this is the first time your organization is using it, we'll have to set it up. To do this you'll need Global Admin rights, so usually you would be required to ask one of your colleagues, but if you are "fortunate" enough to have these rights, go ahead.
Open the Business Store using the link and accept the EULA. Thereafter navigate to Manage\Settings\Distribute and activate Intune.
Next, it's time to choose apps. In the black ribbon at the top of the page, click "Shop for my organization" and search for Power BI. Once you've found it, open it up and click "Get the app". You'll need to accept yet another EULA.
Now, I recommend that you add the app to your private store as well as distribute it with Intune. Next to the "Install" button there is a settings toggle (three dots); in here you can make the app available to everyone in the private part of Microsoft Store for Business.
For now, we are done in Microsoft Store for Business. Go back to the Intune portal and save your settings for Microsoft Store for Business. Refresh the page and you should now see a green checkmark and a text telling you that the connection between Microsoft Store for Business and Intune has been configured.
In the apps list (you'll find the link to it to the left in the portal) you should now see Power BI. Click on the line and then choose Assignments. You have the option to distribute it to both devices and users; in most cases you would want to distribute it to users.
When you choose to assign it, you’ll be asked to provide a group or a user. I would highly recommend using an Azure AD group for assignment.
In terms of how it will be installed, you can choose Available or Required. Required means that the app will be installed automatically, without any user interaction, on devices that the assigned users log on to.
Available, on the other hand, publishes the app in the Company Portal, and the users can then install the app themselves from there.
It's up to you what service you would like to provide your users, and you can mix these two for different groups of users.
If you choose Required, the app will install as soon as the device has synchronized with Intune.
To deploy the Power BI app to Android or iOS, you follow almost the same procedure, but you don't necessarily need to configure the platform-specific stores.
You add an app using the "Add" button at the top of the Apps section. Choose your platform and search for Power BI. You then assign it in exactly the same way as you did with the Windows 10 app.
Distributing and installing Power BI using Microsoft Intune and Win32-app deployment
To install the Win32 version, a few more steps are required on your side. You'll first need to create a package that Intune can distribute for you.
To do this, download the Power BI Desktop installation file and save it in a folder called "PowerBI". Before continuing, also create an empty folder named "PowerBIApp". You can find the download here: https://powerbi.microsoft.com/en-us/desktop/
Interestingly enough, if you choose the "Download" option you'll actually be sent to the Store version, which hints at which version Microsoft thinks we should choose. So, to find the MSI file, we need to choose "Advanced Download Options".
Download the Intune App Packaging tool from GitHub and extract the .ZIP file. https://github.com/Microsoft/Intune-Win32-App-Packaging-Tool
Open a PowerShell prompt as administrator on your machine and run the Intune packaging tool.
You’ll be asked for three things by the packaging tool:
- The full path to the “PowerBI” folder where you saved the installation file, aka “Source Folder”. C:\Users\<Username>\Downloads\PowerBI as an example.
- The full path to the installation file, aka “Setup File”. C:\Users\<Username>\Downloads\PowerBI\PowerBI.msi as an example.
- The full path to the output folder where the package will be created, aka “Output Folder”. C:\Users\<Username>\Downloads\PowerBIApp as an example.
When you have entered this information, the tool will compress the package and the output will be in the form of a .intunewin file.
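The tool prompts for these three things interactively, but you can also pass them as parameters. Put together, the call ends up looking something like this – the paths match the examples above, and the setup file name may differ depending on the download:

```powershell
# Paths are the examples from above; adjust to where you saved things
.\IntuneWinAppUtil.exe `
    -c "C:\Users\<Username>\Downloads\PowerBI" `
    -s "C:\Users\<Username>\Downloads\PowerBI\PBIDesktop_x64.msi" `
    -o "C:\Users\<Username>\Downloads\PowerBIApp"
```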
Now it's time to head back to Intune and add a new app. In the Client Apps section, select Add and choose to add a "Windows app (Win32)".
Select your newly created app package and upload it to Intune.
Next, you are required to enter a suitable name, description and publisher. Make it clear whether this deployment is intended for 32-bit or 64-bit (if you are considering the 32-bit one, we need to have a discussion) and which language you are installing.
In this case, and usually when we base our package on an MSI file, the installation and uninstallation commands are populated automatically. However, due to how the Power BI Desktop MSI file is designed, we need to change the original command line to the following:
msiexec /I PBIDesktop_x64.msi ACCEPT_EULA=1
Intune will then append "/qn ALLUSERS=1" to the command line to complete it.
In the next step you must configure which requirements need to be fulfilled for the installation to take place. We will only configure x64 and a minimum version of Windows 10, but you can also add requirements on disk space, RAM, number of CPUs and CPU speed if you'd like.
To ensure that the installation is successful, Intune requires you to specify a detection rule. The detection rule could be a file, a registry setting or anything else that tells Intune that everything is as it should be. The good thing about using an MSI in this case is that every MSI carries a unique product code, a GUID, which we can use to verify the installation. Select to manually configure a detection rule and add an MSI rule – it will auto-populate for you.
We could also add custom return codes as well as tags (often used for Role-Based Access Control, RBAC), but for now we can leave them.
When you then press Add, the package will start uploading to Intune, being encrypted on the way. Once the upload is done, we open the app and assign it to our users.
We do this in exactly the same way as we did with the Store version of Power BI: select a group and choose how it should be installed. The user experience is, however, a bit different from the app.
After the assignment, it can take a while for the devices to retrieve the instructions to install the app, but after a while it will start to download and install without requiring the user to interact with it (in the case of a required deployment).
Managing Power BI with Microsoft Intune
So, you have deployed your new apps to your machines and the next update gets out. What do you do?
For the app from Microsoft Store for Business, you don't need to do anything. It handles upgrades itself as soon as a new version is available.
For the Win32 app, you need to do some more hands-on work. You'll need to create a package for the new version and create a new app in Intune. Before deploying the new version, you need to remove the assignment of the previous version, because when the update is installed the product GUID changes; the old assignment's detection rule would then fail and trigger a reinstallation of the old version, and so on…
So, the steps in short:
- Create a new package for the new version and upload it as a new app to Intune.
- Remove the assignment of the old version. You could create a “Pilot” group and exclude that from the assignment as well, but that’s another story.
- Create a new assignment for the new version and let the MSI take care of the update itself.
You could automate a lot of this with PowerShell, but that's for another time.
As a wrap-up, your first and most important choice is how to install Power BI. What do you value most, less work or more control? There is nothing that says you couldn't even combine the two – it's just a matter of controlling the Azure AD groups. The users might be very confused, though, to see two different Power BI apps on their machines.
Next, is this something you as the Power BI administrator should do, or do you have a client management team that could help you out? It depends, of course, but this guide will help you get started if you need (or want) to do it yourself.
If you have any questions regarding this, let me know and I'll do my best to help you out!
Don’t forget to follow me on Twitter @Bindertech as well as the #KneeDeepinTech hashtag for more great Microsoft 365 content!
Want to listen to perhaps the most bizarre Microsoft-focused podcast out there? Make sure to subscribe to Knee Deep in Tech on Spotify or wherever you usually find podcasts!
Tomorrow I’m heading to Oslo and the Nordic Infrastructure Conference (NICCONF) – one of my favorite conferences! I’ve been invited to deliver three sessions this year:
“The force awakens – Azure SQL Server for the on-prem DBA” which is an introduction to Azure SQL Server in its different shapes,
“Azure Machine Learning for the absolute beginner” which is an introduction to Azure Machine Learning, its capabilities and what can be done with machine learning, and finally
“Learning to swim – an introduction to Azure Data Lake”, a quick overview of the what, how and when with Azure Data Lake.
Now, I’ve been running around trying to find some props for these sessions, and so far I’ve got fun stuff for two of them. Hopefully Oslo provides the third one tomorrow, and then it’ll be even more fun to attend these sessions. Let’s just say that it is surprisingly difficult to find pool toys in February…
I was invited to speak at NIC last year as well, and I was struck by the friendly feel of the conference. Not too big, not too small, plenty of great speakers, awesome attendees and very kind organizers.
I can’t wait to get there and meet all the attendees – will you join me?