Custom Vision with PowerShell

Custom Vision is a service for creating computer vision models that can be interacted with via a REST API. Custom Vision is powered by Cognitive Services.

Here is a simple demo using PowerShell to determine whether a flag is from New Zealand or Australia. These flags have a lot of similarities, including the Union Jack, a blue background and stars, so I thought they would be a good test of the technology.

The first task is to create a project at https://customvision.ai, upload images of flags, and train the model. In this case I have added two tags, New Zealand and Australia. The minimum number of photos per tag is five. Choose a variety of angles and shapes.


Once the images have been uploaded, choose Train from the menu to build the model.

Next, you will need to get the API URL and Prediction Key from the Performance tab.


Now for the PowerShell bit. In this example, we call the Prediction API with the URL of a flag image.

Replace $CustomVisionAPIURI and $Key with the values from your model, and replace $imageURL with the URL of a flag.

$CustomVisionAPIURI = "https://southcentralus.api.cognitive.microsoft.com/customvision/v1.0/Prediction/00e38c62-acfb-4b30-9d04-481b3ea5436b/url?iterationId=ebe48241-a269-4a30-9223-c80d335462ce"
$Key = "51d70eb8f69044dd89408ae4e60b01da"

$imageURL = "http://images.all-free-download.com/images/graphicthumb/australian_flag_312448.jpg"

# Build a JSON body containing the image URL
$Body = @{ url = $imageURL }
$JsonBody = ConvertTo-Json $Body

# Call the Prediction API, passing the key in the Prediction-Key header
$Result = Invoke-RestMethod -Method Post -Uri $CustomVisionAPIURI -Header @{ "Prediction-Key" = $Key } -Body $JsonBody -ContentType "application/json" -ErrorAction Stop
$Result.Predictions

The results from the script:

TagId                                Tag         Probability
-----                                ---         -----------
61740b10-a791-43a9-b6ae-edd2559431a8 Australia   1.0
3ecac738-a7d8-451d-98b9-7d5b36b20add New Zealand 1.67903181E-07
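You can also drill into the result object directly. As a small sketch (assuming the response exposes a Predictions collection shaped like the output above), this picks out the highest-probability tag:

```powershell
# Sort the returned predictions by probability and keep the best match
$TopTag = $Result.Predictions |
    Sort-Object -Property Probability -Descending |
    Select-Object -First 1

Write-Host "Best match:" $TopTag.Tag
```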

That’s it!

A practical application of this technology could be classifying photos in SharePoint. The script could be extended to run across a library of photos and populate a metadata column to classify each image. For example, a library containing photos of different types of cattle could be processed to determine whether they are Angus, Jersey, Friesian, Hereford, etc.

There are many more advanced applications of Cognitive Services, such as medical image analysis, quality control processes, and face identification.

More technical detail on the Custom Vision API can be found here: https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/




PowerShell: Bulk load files into SharePoint

Here is a script I wrote to bulk upload files and metadata into SharePoint. To make this work you need a CSV file containing two things: the names of the files to upload and the metadata associated with each item.

In this example, the CSV file has the following format:

  • filename,cust_number,document_type

The script reads the CSV file, creates a folder in the document library named with the value of the “cust_number” field, and then uploads the file “filename” and populates the “document_type” column.
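For illustration, here is what a couple of rows might look like (the values are hypothetical; note the script appends the ".pdf" extension itself, so the filename column omits it):

```
filename,cust_number,document_type
invoice_1001,C0042,Invoice
statement_1002,C0055,Statement
```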

The WebClient class is used to upload each file into the document library. The script also checks the item in (if required).

Write-Progress -Activity "Connecting to SharePoint Site" -Status "Please wait ..."
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$CSVFile = "C:\FilesToImport\filelist.csv"

$SPWebURL = "http://sharepoint/site"
$SPListURL = "http://sharepoint/site/library"
$BaseFolder = "C:\FilesToImport\Files"
$Credentials = [System.Net.CredentialCache]::DefaultCredentials

$SPWebObject = Get-SPWeb $SPWebURL
$SPListObject = $SPWebObject.GetListFromUrl("library/Forms/AllItems.aspx")
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = $Credentials

Write-Progress -Activity "Importing CSV File" -Status "Please wait ..."

$CSVObject = Import-Csv $CSVFile
$Index = 0
$Max = $CSVObject.Count

ForEach ($CSVItem in $CSVObject) {
    $Index++
    Write-Progress -Activity "Updating Metadata" -Status "Processing Item $Index of $Max"

    $FileName = $CSVItem.filename + ".pdf"
    $CustNumber = $CSVItem.cust_number
    $DocumentType = $CSVItem.document_type

    $FullFileName = Join-Path $BaseFolder $FileName
    if (Test-Path $FullFileName) {
        # Create the customer folder in the library if it does not already exist
        $Folder = $SPWebObject.GetFolder($SPListObject.RootFolder.Url + "/" + $CustNumber)
        if (-not $Folder.Exists) {
            $SPListObject.RootFolder.SubFolders.Add($CustNumber) | Out-Null
        }

        # Upload the file into the customer folder
        $UploadPath = $SPListURL + "/" + $CustNumber + "/" + $FileName
        $WebClient.UploadFile($UploadPath, "PUT", $FullFileName)

        # Locate the uploaded item and populate the metadata column
        $SPListItemsObject = $SPListObject.Items | Where-Object { $_["Name"] -eq $FileName }
        ForEach ($SPListItem in $SPListItemsObject) {
            $SPListItem["document_type"] = $DocumentType
            $SPListItem.Update()

            # Check the item in if the upload left it checked out
            if ($SPListItem.File.CheckOutStatus -ne "None") {
                $SPListItem.File.CheckIn("Checked in by bulk upload script")
            }
        }
    }
    else {
        # Log files that could not be found on disk
        Add-Content ErrorLog.txt $FullFileName
    }
}

I’ve used this script in a few scenarios. I hope you find it useful too.

SharePoint Initialize-SPResourceSecurity Attempted to perform an unauthorized operation

Here’s a tricky little issue I struck, so I thought I’d share the solution. In my situation I was installing SharePoint 2013 Server on Windows Server 2012.

I built a new SharePoint environment using AutoSPInstaller. The installer PowerShell script threw an error but continued running until it got to the Provision Enterprise Search step. At that point an endless stream of blue dots appeared in the script output and it never ended:

Initialize-SPResourceSecurity : Attempted to perform an unauthorized operation.

I tried running Initialize-SPResourceSecurity from the SharePoint Admin PowerShell (run as Administrator). I received the same error as above.

I thought the issue was with AutoSPInstaller, so I tried running PSConfig, only to have it fail with a similar error. The PSConfig log file contained an error with the same description as the one above, plus an additional line indicating the issue was a file system permission problem. This led me to the solution below.


The solution was to change the “Local Administrators” group permission on the C:\Windows\Tasks folder from Modify to Full Control.
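If you prefer to script the change, an icacls one-liner from an elevated prompt can grant the permission (this assumes the built-in Administrators group is the one that needs the access):

```powershell
# Grant the Administrators group Full Control on C:\Windows\Tasks.
# (OI)(CI) makes the grant inherit to files and subfolders; F = Full Control.
icacls "C:\Windows\Tasks" /grant "Administrators:(OI)(CI)F"
```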

Rerun the AutoSPInstaller or PSConfig and the error should no longer occur.

I hope this helps the next person who strikes this problem!

SharePoint PowerShell Locate Libraries with specific Columns

Here’s a little PowerShell Script I wrote to list all Document Libraries that contain a specific Column. I have specifically checked to see if the BaseTemplate is a library, however you can remove that part of the script and look for all Lists or a specific List type e.g. announcements or calendar.

It can be easily modified to display other details using these documents as a reference:

SharePoint List Properties: http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.client.list_properties.aspx

SharePoint Field Properties: http://msdn.microsoft.com/en-us/library/aa544201.aspx

 $fieldname = "Document Type"  # set this to the column title you are looking for

 $s = Get-SPSite http://sharepoint
 foreach ($w in $s.AllWebs) {
   foreach ($l in $w.Lists) {
     if ($l.BaseTemplate -eq "DocumentLibrary") {
       foreach ($f in $l.Fields) {
         if ($f.Title -eq $fieldname) {
           write-host $l.ParentWebUrl, ",", $l.Title, ",", $f.Title, ",", $l.ID
         }
       }
     }
   }
 }

Another useful snippet takes a Site Column and lists the lists that use it:

$web = Get-SPWeb http://sharepoint
$column = $web.Fields[$fieldname]
$web.Lists | Where-Object { $_.Fields.ContainsField($fieldname) } | Select-Object Title, ParentWebUrl

SharePoint PowerShell List Column Queries

I recently had the need to locate all list columns of a particular data type in a SharePoint site. My first thought was that someone must have done this before, but after a quick hunt around on Google I discovered that wasn't the case.

I wrote a simple script to output the names of lists and fields of a particular type using PowerShell. It is easy to modify the script to look for other field types or to change the field configuration, e.g. switching from plain text to rich text.
This MSDN article contains details of the SPField class and is very handy if you are looking to create a variation of my script.

A small word of warning…always test this in a test environment before running in production. Use this script at your own risk!

Here is my base script:

$SPwebApp = Get-SPWebApplication "http://myintranetURL"
foreach ($SPsite in $SPwebApp.Sites) {
  foreach ($SPweb in $SPsite.AllWebs) {
    write-host "Site URL: ", $SPweb.Url
    foreach ($list in $SPweb.Lists) {
      write-host "List Name: ", $list.Title
      foreach ($field in $list.Fields) {
        if ($field.Type -eq "Note" -and $field.Title -ne "Approver Comments") {
          write-host "field title: ", $field.Title, " > ", $field.Type -nonewline
          if ($field.RichTextMode -eq "FullHTML") {
            write-host " ** Rich Text Enhanced ** " -nonewline
          }
          write-host ""
        }
      }
    }
  }
}

This script walks its way through every list in a given web application and outputs the sites, lists and field names where the field type is "Note" (multi-line text).

Works on both SharePoint 2010 and SharePoint 2013.

SharePoint 2010 and Forefront TMG

Ever wanted to publish SharePoint 2010 externally and found it difficult to understand, and even harder to find good documentation? I've worked on this particular issue several times in the past few months, so I thought it was time to put fingers to keyboard and provide a few tips.

These notes cover publishing SharePoint 2010 with either ISA 2006 or Forefront TMG.

Before you begin:

Commonly your internal SharePoint farm will be accessed over HTTP whilst external access is via HTTPS.
In this example I will use the following configuration:

SharePoint URL: http://sharepoint.domain.local

MySites URL: http://mysites.domain.local

Wildcard digital certificate: *.internetdomain.com

Two external DNS records pointing to the same external IP address on the ISA server:
• SharePoint.internetdomain.com
• Mysites.internetdomain.com

SharePoint Steps:
1. Extend the SharePoint and MySites web applications (in Central Admin)
2. Install your digital certificate (and root certificate) on the Web Front End Server
3. Using PowerShell, add two Alternate Access Mappings (AAMs) for the external HTTPS URLs.

4. In IIS, edit the binding on the extended web application: change it from HTTP to HTTPS and select the certificate above. Once done, remove the HTTP binding listening on port 443; it isn't needed.
5. Make sure the new sites have started; an IISReset may be required.
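Step 3 above can be sketched with the New-SPAlternateURL cmdlet. The internal and external URLs here are the example values from this post, so substitute your own:

```powershell
# Map each external HTTPS URL to the Internet zone of its web application
New-SPAlternateURL -WebApplication "http://sharepoint.domain.local" `
    -Url "https://sharepoint.internetdomain.com" -Zone Internet
New-SPAlternateURL -WebApplication "http://mysites.domain.local" `
    -Url "https://mysites.internetdomain.com" -Zone Internet
```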

Forefront TMG or ISA Server Steps:
1. Create a web listener

  • Redirect HTTP to HTTPS
  • Use the same certificate installed on SharePoint above
  • Configure SSO = .internetdomain.com (this ensures only one login to TMG or ISA is required for all sites on that listener with matching domains)

2. Create two publishing rules, one for SharePoint and the other for MySites

  • Use the same web listener for both
  • Forward the original host headers
  • Bridge the connection using HTTPS (keep the protocols the same between the external URL and the internal URL)

In some instances you may need to create translation rules for HTTP to HTTPS. This can be done on the publishing rule.

Access rules can be used to block access to specific sub-URLs.