Shows on Microsoft Docs

So, it was basically a month ago that I discovered (where have I been?) Shows on Microsoft Docs! It’s the evolution of the Channel 9 website, with all kinds of videos and resources across a myriad of topics. I’d reckon there are literally thousands of hours of content on there!

I’ve come across advanced topics such as “Quantum-inspired algorithms and the Azure Quantum optimization service“, but what I’ve latched onto is the “Beginner’s series” – a collection of instructional videos to get a newbie started on a new concept or technology. To name a few:

Notice how little of that is Microsoft- or Azure-specific? Learning goldmine!

Learn. Connect. Code. @ Microsoft Build May 24-26, 2022

It’s that time of year again! If you haven’t been hiding under a rock (or avoiding Twitter), you may already know that Build is right around the corner!

Be sure to register so you can build (pun intended) your schedule, and tune in to these amazing speakers (just a sampling below):

  • Satya Nadella
  • Kevin Scott
  • Amanda Silver
  • Scott Guthrie
  • Brendan Burns
  • Alysa Taylor
  • … and more!

I hope to virtually “see” you there!

How to Activate your GitHub Benefit in your Visual Studio Subscription

You’ve got a Visual Studio Subscription – great! Next, you’ve got GitHub as part of a new benefit – even better!

Now, how do you activate that benefit and get going with Visual Studio and GitHub for a “better together” experience?

Simple: walk through this article, Set up Visual Studio subscriptions with GitHub Enterprise | Microsoft Docs, which will help you activate your GitHub benefit, get your organization set up, and step off in the right direction.

Want more? Check out this video, which shows how to get started using GitHub (and Git) and Visual Studio together:

More resources:

Stretching your Dev/Test Dollar in Azure

Visual Studio Subscriptions, Dev/Test Subscriptions & More

You’ve got needs. Dev/Test needs. And access to an infinite world of power with Azure. How can you leverage that cloudy frontier for non-production work, i.e. Dev/Test?

There are typically two main scenarios when talking about Dev/Test in the cloud:

  • You want a sandbox to play in that won’t cost much or impact others. A place that you can use to try new things, spin up new services, and learn.
  • You need a shared sandbox for collaborative work (but not production), for trying new concepts or architectures, or simply don’t want to pay production prices for non-production environments.

In the basic sense, there are two options for getting the most out of your dollars in Azure for Dev/Test: Visual Studio Subscriptions (if you own them) with their Azure credit benefit, and the Enterprise Dev/Test Subscription (in Azure, not to be confused with a VS Subscription). Let’s take a walk through both approaches.

Visual Studio Subscriptions – Azure Credits Benefit

Your Visual Studio Subscription comes with monthly credits in Azure to use against your own, individual Azure subscription. Depending on your flavor of Visual Studio, you get a different amount of credits ($50/$100/$150), which resets monthly (this means that if you go over your credit limit, your resources simply shut off until your billing period restarts).

Monthly Azure credits as part of your Visual Studio Subscription benefit

If you haven’t already done so, you can activate your monthly credit benefit at https://my.visualstudio.com/ (see below).

This monthly benefit gives you access to most Azure services for learning and speculative individual development, without bothering anyone.

  • Individual skills development
  • Experimenting in Azure
  • Light individual dev/test workloads
  • Prototyping, POCs, “throw-away” work

That said, some organizations I’ve worked with in the past don’t like/allow these Visual Studio-based Azure subscriptions, as the organization doesn’t initially have as much control over them. So it’s best to check with your IT team and your Visual Studio Subscriptions administrator (how to find/contact).

Enterprise Dev/Test Subscription (Azure)

If you have an Enterprise Agreement with Microsoft, the Enterprise Dev/Test Subscription may be a great option if you have Visual Studio subscribers who want/need to collaborate in Azure in a non-prod environment.

At a high-level, the Enterprise Dev/Test Offer is:

  • For non-production workloads in Azure
  • Special, lower rates on Windows VMs, cloud services, SQL, HDInsight, App Service, Logic Apps. (Additional savings with Reservations)
  • Access to Dev/Test images (incl. Win 8.1/10)
  • Centralized management via Azure Enterprise Portal
  • Use funds on your Enterprise Agreement/MACC (or use the pay-as-you-go (PAYGO) Dev/Test offer)

That said, it’s important to make a distinction between the Enterprise Dev/Test Azure subscription and a standard Enterprise Azure subscription:

Enterprise Dev/Test                       | Standard Enterprise
------------------------------------------|--------------------
Restricted to dev/test usage only         | Any/all workloads
Can only be used by active VS subscribers |
Includes dev/test images (Win 8.1 & 10)   |
Not covered by SLAs                       | Covered by SLAs

Main differences between Enterprise Dev/Test & standard Enterprise

Critical points for a Dev/Test subscription:

  • There is no financially-backed SLA
  • Any and all users of a Dev/Test subscription MUST have Visual Studio Subscriptions.

Sound like a good option? Your Azure Account Owner (w/ permission to do so from an Enterprise Administrator) can create a Dev/Test subscription for you.

Where does each fit in?

Let’s build this out visually.

Individual developer, learning, prototyping? Use the Azure credits as part of your Visual Studio Subscription.

Need to collaborate with your broader team in a non-prod environment? Go the Enterprise Dev/Test route.

Production workloads? Use your standard Enterprise Subscription (don’t forget your AHUB benefits to minimize costs!).

For further reading:

Azure Pipelines: Classic or YAML?

I get this question often: “We’ve been using Azure Pipelines for a while, and really enjoy the WYSIWYG experience of it for building pipeline definitions. But now there’s this YAML-based option as well, and it seems to be getting the most love. Should I switch? Or what should I use?”

The short answer is “it depends” (isn’t that always the answer?). It depends on whether you’re mid-flight with a project or starting a new one, and on your team’s tolerance for making such a change.

Let’s do a quick comparison.

Classic vs. YAML

Classic

The classic editor for pipelines has been around since the inception of Azure Pipelines. It’s user friendly in that you simply add in the tasks you want from a visual list, fill in some blanks, and you’re done (I’m simplifying but you get the picture). It’s easy to see what a classic pipeline does, and troubleshooting is a breeze. Need to add a task? Simply search for it, click “add”, and that’s it!

Changes to pipeline definitions are versioned, but not in any useful DevOps way other than being able to see what’s changed. There is always one “live” version of a definition at any given time. So while versioned, you can’t do any kind of review, promotion/demotion, or branching/merging with a classic pipeline.

YAML

YAML, on the other hand, isn’t much in the way of being visual. It’s markup/code based. So what? To answer that question, let’s look at what other code you may have and why it’s represented as code.

You probably have application code, right? It defines what your app does. You store it (or should) in a version control repo somewhere. This provides a mechanism to track changes to your app, maintain different releases, and roll back when necessary.

You may have Infrastructure as code (IaC) as well. It defines the infrastructure on which an application will run. It’s also (or at least should be) stored in a repo somewhere, for the same purposes as your app code, but also because it allows you to evolve the infrastructure needs of the application along with the application itself – keeps everything in sync. With me so far?

Think of a YAML pipeline definition as “pipelines as code.” It’s a code-based definition of what your CI/CD workflow does when called upon. By managing your pipeline as code, you can version alongside both the app code and the infrastructure code, keeping everything in harmony. Need to branch your app code? What happens if you do that, and as part of the work you do in said branch the infra needs change? If you’re using IaC, you can modify that code as well and you’re good. But what if the changes also require changing the CI/CD workflow as well? YAML pipelines to the rescue. By keeping as many aspects of your DevOps assets and workflows in a repository as possible, you minimize the complications of deploying from branches, re-work upon merging a PR, etc.
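To make that concrete, below is a minimal sketch of what a YAML pipeline definition (azure-pipelines.yml) might look like; the trigger, pool, and task shown are illustrative placeholders, not a prescription. Because this file lives in the repo, it branches, merges, and gets reviewed right alongside your app code and IaC.

# Minimal illustrative azure-pipelines.yml (names and tasks are examples)
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo "Restore, build, and test the app here"
  displayName: 'Build and test'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'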

Learning YAML & Migrating from Classic

The good news is that, although it has more of a learning curve than the classic experience, YAML isn’t crazy hard. You just need to understand the syntax and nuance.

If you’re currently using the classic experience, there are some helpers built into Azure Pipelines to assist you in making the switch to YAML, such as the “Export to YAML” feature as well as the “View as YAML” feature.

Learn more here:

Managing PII & PHI in Azure DevOps

Summary

As teams expand and deepen their usage of Azure DevOps, there is the propensity for Personally Identifiable Information (PII) to be introduced into work items (user stories, bugs, test cases, etc.). This can introduce liability and privacy issues for organizations. PII can creep into Azure DevOps environments in forms including (not necessarily all-inclusive):

  • Field values in user stories, bugs, tasks, etc. (description, acceptance criteria, title, and other HTML or text fields)
  • Test cases (title, description, test step descriptions, test step attachments)
  • Attachments to any work item type

While there is not an off-the-shelf solution to help with this, there are ways to leverage Azure to develop such a utility to find, manage, and notify appropriately when PII exists in the Azure DevOps organization.

Approaches to Scanning

There are two main approaches to this problem. First, the team needs to find any PII that already exists in Azure DevOps and take the necessary action. Second, the team needs a method to detect PII as close to real time as possible once it is introduced.

Full Scan of Azure DevOps

To “catch up” and deal with PII that already exists in Azure DevOps, a comprehensive scan of Azure DevOps content is needed.

Below is a very high-level pseudo-code outline for performing such a scan. It takes into consideration all the aforementioned areas where PII could be present in Azure Boards or Azure Test Plans (the components of Azure DevOps that leverage work items).

I also built a sample (just a sample, only a sample) here in GitHub.

Connect to Azure DevOps organization
Foreach (project in organization)
{
    Foreach (workItemType in project)
    {
        Get all work items for the current workItemType
        Foreach (workItem in workItemsByType)
        {
            Get all HTML and text field values that could contain PII
            Send HTML/text field values to Azure Text Analytics (in batch document)
            Foreach (valueWithPII in TextAnalyticsResults)
            {
                Take some action (notification, redaction, removal)
            }
            Get attachments for the workItem
            Foreach (attachment in workItemAttachments)
            {
                Send attachment content (supported format) to Azure Computer Vision
                Send computerVisionResults to Azure Text Analytics
                Foreach (attachmentWithPII in AttachmentAnalyticsResults)
                {
                    Take some action (notification, removal)
                }
            }
            If (workItemType is a Test Case)
            {
                Get all values of each test step
                Send test step values to Azure Text Analytics (in batch document)
                Foreach (testStepWithPII in TestStepAnalyticsResults)
                {
                    Take some action (notification, redaction, removal)
                }
                Foreach (attachment in TestSteps)
                {
                    Send attachment content (supported format) to Azure Computer Vision
                    Send computerVisionResults to Azure Text Analytics
                    Foreach (attachmentWithPII in AttachmentAnalyticsResults)
                    {
                        Take some action (notification, removal)
                    }
                }
                Get any test case parameters
                Send test case parameters to Azure Text Analytics (in batch document)
                Foreach (paramWithPII in TestParametersAnalyticsResults)
                {
                    Take some action (notification, removal)
                }
            }
        }
    }
}

This solution could also be used for a periodic scan if real-time/triggered scans are prohibitive.
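To make the Text Analytics step concrete, here’s a minimal Python sketch using the azure-ai-textanalytics package (version 5.1+); the endpoint, key, and document values are placeholders, and error handling is omitted.

# Sketch: send a batch of work item field values to Azure Text Analytics for PII detection.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

# e.g., the HTML/text field values gathered for one work item
documents = [
    "Parker Doe has repaid all of their loans as of 2020-04-25.",
    "To contact them, use their phone number 800-102-1100.",
]

for result in client.recognize_pii_entities(documents):
    if not result.is_error and result.entities:
        # Take some action (notification, redaction, removal)
        print(result.redacted_text)  # Text Analytics returns a redacted version automatically
        for entity in result.entities:
            print(f"  {entity.category}: {entity.text} ({entity.confidence_score:.2f})")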

Reference Documentation

Incremental or Triggered Scan

Moving forward, teams will need to detect the introduction of PII into Azure DevOps as soon as possible. There are a couple of approaches to this more incremental or trigger-based scan.

First, the solution developed in “Full Scan of Azure DevOps” could be utilized here as well, parameterized to check only the most recent items for a given interval. For example, if the scan is to run every hour, filter work item querying to return only items with a ChangedDate in the last 60 minutes.
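As a sketch of that filtering (assuming the azure-devops Python package; the organization URL and personal access token are placeholders, and the versioned module path may differ by package version):

# Sketch: query only work items changed in the last 60 minutes.
from datetime import datetime, timedelta, timezone
from azure.devops.connection import Connection
from azure.devops.v7_1.work_item_tracking.models import Wiql
from msrest.authentication import BasicAuthentication

connection = Connection(
    base_url="https://dev.azure.com/<your-org>",
    creds=BasicAuthentication("", "<personal-access-token>"),
)
wit_client = connection.clients.get_work_item_tracking_client()

since = (datetime.now(timezone.utc) - timedelta(minutes=60)).strftime("%Y-%m-%dT%H:%M:%SZ")
wiql = Wiql(query=f"SELECT [System.Id] FROM WorkItems WHERE [System.ChangedDate] >= '{since}'")

# time_precision=True makes ChangedDate comparisons honor the time, not just the date
result = wit_client.query_by_wiql(wiql, time_precision=True)
recent_ids = [ref.id for ref in result.work_items]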

Second, Azure Logic Apps could be used to trigger when work items are updated in Azure DevOps, providing detection within 1 minute of PII introduction. The Logic App would orchestrate the extraction of content to check, as well as any mitigation actions.

Below are a couple of screenshots of basic examples of using a Logic App (steps are simplified for brevity).

Detect and Notify

This first example simply checks the description field of a work item for PII, using two Azure Functions (SanitizeString to remove HTML markup, and CheckPHIWorkItem to receive the text to check, call Azure Text Analytics, and format a response). It is triggered by an update to a work item. When complete, it sends an email to the person who updated the work item.

Detect, Mitigate, and Notify

This second example avoids using Azure Functions for Text Analytics by using the built-in connector for Text Analytics. It is triggered by an update to a work item. If the work item was updated by a human user, the Text Analytics connector is called. If PII is detected, the PII is copied to a file in Azure Blob Storage (where more RBAC controls are present), and either the redacted text or a violation notice is updated back into the work item. When complete, it sends an email to the person who updated the work item.

While there are Logic App connectors for Azure DevOps, Text Analytics, and Computer Vision, Azure Functions provide more granular control (and move the solution toward more of a microservices architecture). Create Azure Functions to do the following (a sketch of the first follows the list):

  • “Sanitize” HTML field values to plain text
  • Manage collation and interaction with Azure Text Analytics for text values
  • Manage OCR actions using Azure Computer Vision to extract text values from images and other attachments
  • Conduct PII replacement, redaction, or removal
  • Facilitate logging (to Azure storage, databases, or Azure Event Hubs)
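For instance, the “sanitize” Function could be a small HTTP-triggered Azure Function along these lines (Python v1 programming model; the function contract is hypothetical):

# Sketch: strip HTML markup from a field value and return plain text.
import azure.functions as func
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def main(req: func.HttpRequest) -> func.HttpResponse:
    html = req.get_body().decode("utf-8")
    extractor = _TextExtractor()
    extractor.feed(html)
    return func.HttpResponse(" ".join(extractor.chunks).strip())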

Lastly, the “Full Scan” solution could be combined with the Azure Functions/microservices-style architecture to create more reusable components, allowing for easier updates, fixes, and scale. For example, create Functions for each of the above-bulleted capabilities, and leverage those Functions from the “Full Scan” solution as well as the “Incremental Scan” solution.

Azure Services Used

Below are Azure services that could potentially be used for this solution and are referenced in this document.

  • Azure Cognitive Services: Azure Cognitive Services are cloud-based services with REST APIs and client library SDKs available to help you build cognitive intelligence into your applications. You can add cognitive features to your applications without having artificial intelligence (AI) or data science skills. Azure Cognitive Services comprise various AI services that enable you to build cognitive solutions that can see, hear, speak, understand, and even make decisions.
    • Text Analytics: The Text Analytics API is a cloud-based service that provides Natural Language Processing (NLP) features for text mining and text analysis, including: sentiment analysis, opinion mining, key phrase extraction, language detection, and named entity recognition.
    • Computer Vision: The cloud-based Computer Vision API provides developers with access to advanced algorithms for processing images and returning information. By uploading an image or specifying an image URL, Microsoft Computer Vision algorithms can analyze visual content in different ways based on inputs and user choices. Learn how to analyze visual content in different ways with quickstarts, tutorials, and samples.
  • Azure Logic Apps: Azure Logic Apps is a cloud-based platform for creating and running automated workflows that integrate your apps, data, services, and systems.
  • Azure Functions: Azure Functions is a serverless solution that allows you to write less code, maintain less infrastructure, and save on costs. Instead of worrying about deploying and maintaining servers, the cloud infrastructure provides all the up-to-date resources needed to keep your applications running. You focus on the pieces of code that matter most to you, and Azure Functions handles the rest.
  • Azure Blob Storage: Azure Blob storage is Microsoft’s object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data.
  • Azure Event Hubs: Azure Event Hubs is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

Development Considerations

While the conceptual approach to scanning Azure DevOps is straightforward, there are programming considerations to discuss if you want to complete a comprehensive scan of Azure DevOps work items. These are a few that I discovered during my research.

  • Dealing with attachments: Checking for PII in an attachment requires additional steps, depending on the file format.
    • Text-based (.txt, .json, .xml, .html, etc.): These can have their content streamed to memory as text, and that content can then be sent to the Text Analytics API.
    • Binary (.jpg, .doc, .pdf, .png, etc.): If the format is supported by the Read API, the attachment URL can be provided to the Read API directly (provided the identity used to run the Cognitive Services resource has access to Azure DevOps). Otherwise, these attachments will need to be downloaded, and depending on the file type, additional methods will be needed to get the content into a file format accepted by the OCR features in Azure Computer Vision (the Read API).
      • The Read API has the following input requirements:
        • Supported file formats: JPEG, PNG, BMP, PDF, and TIFF
        • For PDF and TIFF files, up to 2000 pages (only first two pages for the free tier) are processed.
        • The file size must be less than 50 MB (6 MB for the free tier) and dimensions at least 50 x 50 pixels and at most 10000 x 10000 pixels.
  • As documented in the pseudo-code for the full scan approach, an additional check and loop is needed to iterate test steps in a Test Case. For any attachments on a test step, the above “dealing with attachments” considerations also apply.
  • An individual Logic App can only be triggered by changes to a single project (the trigger action can be bound to only one Azure DevOps project).
  • Work Items: The API limits work item query results to 200 (a chunking sketch follows this list).
    • You may need to build narrower queries, such as iteratively “walking back” by querying for items in the last 1 day, then 2 days, and so on.
    • OData feeds support more than 200 results, but don’t include text-based fields; additional calls would have to be incorporated.
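One way to work within that 200-item limit when pulling full work item details (a sketch reusing the wit_client from the earlier snippet):

# Sketch: retrieve full work items in chunks that respect the 200-item batch limit.
def get_work_items_chunked(wit_client, ids, chunk_size=200):
    items = []
    for i in range(0, len(ids), chunk_size):
        # get_work_items accepts a list of IDs and returns the full work items
        items.extend(wit_client.get_work_items(ids=ids[i:i + chunk_size]))
    return items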

Actions Upon Detection

Regardless of the approach used to detect PII, the actions taken upon detection are most important. What to do depends on urgency, compliance, and trust.

Logging

Simply logging the detection may be good enough if proper reporting is all that is needed. Sending the detection event to Azure Event Hubs or Azure Event Grid provides an avenue for the event to be recorded in an analytics workspace or analysis engine.
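As a sketch of the Event Hubs path (assuming the azure-eventhub Python package; the connection string, hub name, and event payload are placeholders):

# Sketch: record a PII detection event to Azure Event Hubs.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<event-hub-connection-string>", eventhub_name="pii-detections")

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"workItemId": 1234, "field": "System.Description"})))
    producer.send_batch(batch)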

Notification

Notification can involve several methods:

  • Email to the user introducing the PII, that person’s manager, or a compliance team.
  • Post a message to Microsoft Teams.
  • Place a tag on the work item to draw attention to it.

Mitigation

Mitigation involves taking direct action on the PII content. During this exercise, several options presented themselves (a sketch of the third follows the list). For example, if the following text was detected in the description field of a work item:

Parker Doe has repaid all of their loans as of 2020-04-25. Their SSN is 859-98-0987. To contact them, use their phone number 800-102-1100.

  • PII deletion: Delete the PII content and save the work item.
  • PII redaction: The content can be replaced with its redacted equivalent (Azure Text Analytics provides redaction automatically): ********** has repaid all of their loans as of **********. Their SSN is ***********. To contact them, use their phone number ************.
  • Secure the PII: Move the PII content to a location that has proper RBAC, such as Azure Blob Storage. Replace the description field with a notice of the PII detection and the blob URL. Only those with RBAC access will be able to view it.
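Here’s a minimal sketch of that third option, assuming the azure-storage-blob and azure-devops Python packages (connection strings, the container name, and the PAT are placeholders; error handling is omitted):

# Sketch: quarantine detected PII in Blob Storage, then overwrite the work item field.
from azure.storage.blob import BlobServiceClient
from azure.devops.connection import Connection
from azure.devops.v7_1.work_item_tracking.models import JsonPatchOperation
from msrest.authentication import BasicAuthentication

def secure_pii(work_item_id: int, original_text: str):
    # Copy the original text into an RBAC-protected container
    blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob = blob_service.get_blob_client(
        container="pii-quarantine", blob=f"workitem-{work_item_id}.txt")
    blob.upload_blob(original_text, overwrite=True)

    # Replace the description with a notice pointing at the quarantined blob
    connection = Connection(
        base_url="https://dev.azure.com/<your-org>",
        creds=BasicAuthentication("", "<personal-access-token>"))
    wit_client = connection.clients.get_work_item_tracking_client()
    patch = [JsonPatchOperation(
        op="replace",
        path="/fields/System.Description",
        value=f"PII detected and moved to a secured location: {blob.url}")]
    wit_client.update_work_item(document=patch, id=work_item_id)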

Associated Costs

Specific costs of this solution depend heavily on overall volume: frequency of execution, API calls, number of records checked, etc. Below is reference pricing for each service that could be utilized.

Logic Apps

Pricing page

  • Price Per Execution: $0.000025 (First 4,000 actions free)
  • Standard Connector: $0.000125 (Azure DevOps, Text Analytics)

Text Analytics

Pricing page

  • 0-500,000 text records — $1 per 1,000 text records
  • 0.5M-2.5M text records — $0.75 per 1,000 text records
  • 2.5M-10.0M text records — $0.30 per 1,000 text records
  • 10M+ text records — $0.25 per 1,000 text records

A text record is a request of up to 1,000 characters.
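For a rough, hypothetical sense of scale: a full scan of 100,000 work items averaging two text records each would consume 200,000 text records, which at the $1-per-1,000 tier comes to about $200 per scan.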

Computer Vision

Pricing page

  • OCR pricing table
    • 0-1M transactions — $1 per 1,000 transactions
    • 1M-10M transactions — $0.65 per 1,000 transactions
    • 10M-100M transactions — $0.60 per 1,000 transactions
    • 100M+ transactions — $0.40 per 1,000 transactions

Disclaimer

The information in this document is designed to be a sample reference for the solution described. It does not imply an enterprise-ready solution architecture, nor represent a commitment to fully design and build the described solution.

Azure DevOps Licensing Primer

Licensing Azure DevOps shouldn’t be scary. Let me break it down for you.

Azure DevOps Services (cloud)

Don’t overthink how to license & pay for Azure DevOps. It’s more straightforward than you think, for 2 reasons:

  1. The costs you see on the Azure Pricing Calculator are all you need to worry about. Azure DevOps doesn’t charge additional fees under the covers for things like Azure Storage, databases, etc. WYSIWYG, basically.
  2. Worried you’ll over- or under-purchase? Azure DevOps is billed monthly via an Azure subscription, which means you can dial what you need from Azure DevOps up and down each month.

For Azure DevOps Services (i.e. cloud-hosted), user licensing is pretty straightforward:

  • Basic: $6/user/month
  • Basic + Test Plans: $52/user/month

A few considerations which can reduce overall monthly cost (worked example after the list):

  • Azure DevOps Services comes with 5 free Basic users included.
  • A Visual Studio Professional subscription includes a Basic license for no additional cost.
  • A Visual Studio Enterprise subscription includes a Basic + Test Plans license for no additional cost.
  • An MSDN Platforms subscription includes a Basic + Test Plans license for no additional cost.
  • A Visual Studio Test Professional subscription includes a Basic + Test Plans license for no additional cost.
  • Stakeholder users (see the comparison halfway down this page) are free.
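A quick hypothetical example: a 12-person team in which 2 members hold Visual Studio Enterprise subscriptions would pay for only 5 Basic licenses: 2 users are covered by their VS subscriptions, 5 use the free Basic allotment, and the remaining 5 cost $6 each, or $30/month.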

Note: Azure DevOps Basic User licenses grant rights to use both Azure DevOps Services (cloud-hosted) AND Azure DevOps Server (on-prem). Azure DevOps Server (on-prem) CALs grant rights to Azure DevOps Server (on-prem) only.

Other components of Azure DevOps

  • Azure Pipelines – includes 1 Microsoft-hosted agent and 1 self-hosted agent. (How to choose)
    • Each additional Microsoft-hosted agent/pipeline: $40/month
    • Each additional self-hosted agent/pipeline: $15/month
  • Azure Artifacts – First 2 GB free, then progressive pricing based on usage:
    • 0 – 2 GB = Free
    • 2 – 10 GB = $2 per GB
    • 10 – 100 GB = $1 per GB
    • 100 – 1,000 GB = $0.50 per GB
    • 1,000+ GB = $0.25 per GB

(You can also set Azure Artifacts to limit your feeds to 2 GiB to prevent any charges)
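To illustrate the progressive tiers with a hypothetical feed size: 15 GB of artifacts would cost nothing for the first 2 GB, $16 for the next 8 GB (8 × $2), and $5 for the remaining 5 GB (5 × $1), roughly $21/month.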

Pricing page: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/

Azure DevOps Server (on-premises)

For Azure DevOps Server (on-premises), there is a server license, but it’s included with any active Visual Studio subscription (so it’s doubtful you’d need to purchase one individually). In fact, any active Visual Studio subscription includes both a server license and a client access license (CAL) for Azure DevOps Server.

More information: