Using Text Analytics Key Phrase Cognitive Services API from PowerShell

There are abundant samples for executing the Text Analytics Cognitive Services API from C# and other languages like Node.js. While searching, I could not find any examples of consuming the Text Analytics API through PowerShell, and that is what this blog post is all about.

In this blog post, I am going to show how to use the Text Analytics Key Phrase Cognitive Services API to extract key phrases from a given sentence or paragraph. Cognitive Services are REST APIs that can be invoked from any language and any platform. They are built using industry standards, and message exchange happens through JSON payloads.

It is important to understand that Cognitive Services are provided as a PaaS offering from Azure. You need a valid Azure subscription and must provision a Cognitive Services resource in a resource group. While provisioning this resource, Text Analytics API should be chosen as the API type. The Text Analytics API service contains a set of REST APIs, one of which is related to key phrase extraction. This is shown in Figure 1.

Figure 1: Cognitive Services Text Analytics

After the service is provisioned, it generates a set of unique keys associated with the service. Any client that wants to invoke and consume this instance of Cognitive Services should send one of these keys with the request. The service validates the key and, if it matches the key it holds, allows the request to execute successfully.

Now that the service is provisioned, it's time to write the client using PowerShell.

 

Open your favorite PowerShell console and write the script shown next. The code is quite simple and consists of only a few statements.

# URI of the Key Phrases API of Text Analytics
$keyPhraseURI = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/keyPhrases"

# key to identify a valid request. You should provide your own key
$apiKey = "xxxxxxxxxxxxxxxxxxxxxxx"

# preparing the JSON document used as the message payload
$documents = @()
$message = @{"language" = "en"; "id" = "1"; "text" = "I had a wonderful experience! The rooms were wonderful and the staff were helpful." }
$documents += $message
$final = @{documents = $documents}
$messagePayload = ConvertTo-Json $final

# invoking the Key Phrase REST API
$result = Invoke-RestMethod -Method Post -Uri $keyPhraseURI -Headers @{ "Ocp-Apim-Subscription-Key" = $apiKey } -Body $messagePayload -ContentType "application/json" -ErrorAction Stop

The code is commented, but to walk through it briefly: the first statement declares a variable holding the URL of the Text Analytics Key Phrase REST API, and the second holds the key required to authenticate the request with the Cognitive Services endpoint. You should provide your own key.

The next set of statements prepares the JSON message payload that is passed to the REST API as the request body. A hashtable is declared containing language, id and text key-value pairs, added to a documents array and converted into JSON format. The last line invokes the REST API using the Invoke-RestMethod cmdlet, passing in the URI, a header containing a custom item, the body and the content type. It is important that the header contains the Ocp-Apim-Subscription-Key custom header with the API key as its value; the request will fail if this header is missing or contains an invalid key.
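For reference, writing $messagePayload to the console should show a JSON document roughly like the following (the exact formatting depends on ConvertTo-Json):

{
    "documents": [
        {
            "language": "en",
            "id": "1",
            "text": "I had a wonderful experience! The rooms were wonderful and the staff were helpful."
        }
    ]
}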

The response is a JSON object containing the key phrases extracted by the Text Analytics service.

Executing $result.documents.keyPhrases on the console returns the key phrases extracted by the Text Analytics service, as shown next.

PS C:\Users\rimodi> $result.documents.keyPhrases

staff

wonderful experience

rooms

Hope you liked the blog post. Please send your feedback, and if you would like to stay connected, you can reach me on Twitter @automationnext and on LinkedIn at https://www.linkedin.com/in/ritesh-modi/

Happy coding and Cheers!

 


Desired State Configuration: Setting up of Pull Server

Below is a small excerpt from my book on PowerShell Desired State Configuration.

There are a couple of ways to set up the pull server. Microsoft has released a resource of type xDSCWebService that can be used to deploy the pull server artifacts and to create and configure the IIS endpoints. One way is to use the xDSCWebService resource type in a configuration to create a pull server. Another way is to create the pull server manually, step by step. The manual way of configuring the pull server is definitely more involved and error prone; however, errors can be reduced by automating the process of creating the pull server, which at the same time provides more control over every aspect of the configuration.

In this section, we will use the second approach and build the entire pull server from scratch.

There is a sequence of steps that needs to be undertaken to create a pull server. We will create the entire pull server using PowerShell so that the script can be reused to re-create pull servers on an as-needed basis.

  1. The first step in creating a pull server is to log on to the server that we want to configure as the pull server. Open Server Manager and go to Add Roles and Features. In Windows Server 2012 R2, a new sub-feature named "Windows PowerShell Desired State Configuration Service" is available under the "Windows PowerShell" feature. We have to enable this feature.

    Figure 2: Enabling the DSC Service

    Before we enable the sub-feature "Windows PowerShell Desired State Configuration Service", there are certain prerequisite features that need to be enabled. These include:

    1. Web Server
    2. Windows PowerShell
    3. .NET Framework 3.5 and 4.5 features
    4. Management OData IIS Extension

    The PowerShell needed to enable the above-mentioned prerequisites and the "Windows PowerShell Desired State Configuration Service" feature is shown in Figure 3, and a sketch of it follows the figure.

    Figure 3: Enabling features through PowerShell
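    A minimal sketch of the kind of script the figure shows; the feature names (for example, ManagementOdata for the OData extension) are assumptions and may need adjusting for your server build.

    # names of the Windows features required for the pull server (names are assumptions)
    $features = "Web-Server", "NET-Framework-Core", "NET-Framework-45-Core", "ManagementOdata", "DSC-Service"

    # target computer; using a variable keeps the script generic
    $computerName = $env:COMPUTERNAME

    # enable the features, including DSC-Service which carries the pull server artifacts
    Install-WindowsFeature -Name $features -ComputerName $computerName -IncludeManagementTools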

    In Figure 3, we declare a few variables to hold the internal names of all the Windows features. We also create a variable to hold the computer name, which helps keep the script generic: we use the variable instead of a hardcoded computer name. The Install-WindowsFeature cmdlet is used to enable the features, including "DSC-Service", which provides all the pull server artifacts.

    Enabling this feature deploys multiple files on the file system; in particular, a folder named "PullServer" is deployed under the "C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration" folder. The parent folder contains the Desired State Configuration module, which in turn contains all the DSC cmdlets we have been using so far. We will use these files to create both a pull server and a compliance server. We will look into the compliance server in the next chapter.

    Figure 4: Files for the pull server

  2. A DSC pull server is essentially an IIS website that accepts configuration requests and responds with configuration details. We have to build an IIS website that will act as the DSC pull server endpoint. It follows that we need a physical directory mapped to the IIS website, and an application pool to run the website, both configured with appropriate values. In PowerShell, we create a few more variables to hold these values, as shown in Figure 5 and sketched below it. These variables refer to the website physical directory path "C:\PSDSCPullServer", a sub-directory within it named "bin" needed by web applications for storing assemblies, the name of the application pool "DSCPullServer" to be created and associated with the website, "4.0" as the application pool .NET version, "DSCPullServer" as the name of the website, and 8080 as the website port number.

    Figure 5: Variables
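    A sketch of the variables described above (the values follow the description; adjust them if you change any of the defaults):

    # physical paths for the website and its bin sub-folder
    $pathPullServer = "C:\PSDSCPullServer"
    $binPath = "$pathPullServer\bin"

    # application pool and website settings
    $appPoolName = "DSCPullServer"
    $appPoolDotNetVersion = "v4.0"   # "4.0" in the text; IIS expects the "v" prefix
    $siteName = "DSCPullServer"
    $port = 8080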

  3. Create the physical folder that will be mapped to the website's virtual path. In Windows Explorer, navigate to C:\ and create a new folder named "C:\PSDSCPullServer". Also, within this new folder, create a sub-folder named "Bin". We have chosen here to create a folder directly under C:\; readers are free to choose any valid file system location. If the location is changed, the PowerShell variables should also be changed appropriately to reflect the change.

    To achieve the same through PowerShell, we can execute the commands shown in Figure 6 and sketched below it.

    Figure 6: Folders
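    A sketch of the equivalent PowerShell, reusing the variables declared earlier:

    # create the website folder and its bin sub-folder
    New-Item -Path $pathPullServer -ItemType Directory -Force
    New-Item -Path $binPath -ItemType Directory -Force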

    Figure 7: PullServer folder

  4. The next step is to copy the pull server IIS endpoint files to the PSDSCPullServer directory and the bin folder just created. Through Windows Explorer, copy "Microsoft.Powershell.DesiredStateConfiguration.Service.dll" from "C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration\PullServer" to the "C:\PSDSCPullServer\bin" folder. This assembly contains the core functionality of the DSC pull server.

    Also, copy the "Global.asax", "PSDSCPullServer.mof", "PSDSCPullServer.svc", "PSDSCPullServer.xml" and "PSDSCPullServer.config" files from "C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration\PullServer" to the "C:\PSDSCPullServer" folder. These are the core ASP.NET files needed for a functioning ASP.NET web application.

    "PSDSCPullServer.config" is actually the web.config file for the pull server website endpoint and should be renamed to web.config after the copy operation.
    To achieve the same through PowerShell, we can execute the commands shown in Figure 8 and sketched below it.

    Figure 8: Copying files
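    A sketch of the equivalent PowerShell; $pshome resolves to C:\Windows\System32\WindowsPowerShell\v1.0:

    # source folder containing the pull server endpoint files
    $source = "$pshome\Modules\PSDesiredStateConfiguration\PullServer"

    # the service assembly goes into the bin folder
    Copy-Item "$source\Microsoft.Powershell.DesiredStateConfiguration.Service.dll" -Destination $binPath

    # the ASP.NET files go into the website root
    Copy-Item "$source\Global.asax", "$source\PSDSCPullServer.mof", "$source\PSDSCPullServer.svc", "$source\PSDSCPullServer.xml" -Destination $pathPullServer

    # PSDSCPullServer.config becomes the web.config of the site
    Copy-Item "$source\PSDSCPullServer.config" -Destination "$pathPullServer\web.config"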

    Figure 9: Website files

  5. Now, it's time to create the IIS website and application pool. To create them, open Internet Information Services (IIS) Manager and create a new application pool named "DSCPullServer".

    Figure 10: IIS application pools

    Select the newly created DSCPullServer application pool and click Advanced Settings. Within Advanced Settings, change the identity from "ApplicationPoolIdentity" to "LocalSystem".

    Figure 11: IIS settings

    After the Application Pool is created, it’s time to create the website.

    Click Sites in IIS Manager and then Add Website. In the resulting dialog, provide "DSCPullServer" as the website name, choose the previously created "DSCPullServer" application pool, provide the physical path "C:\PSDSCPullServer" created earlier, and set the port number to 8080. Note that these values are all configurable and readers can provide their own values.

    Figure 12: Website creation

    To achieve the same through PowerShell, we can execute the commands shown in Figure 13 and sketched below it. First, we import the WebAdministration module, which provides all the functionality related to IIS.

    It also provides the "IIS:" provider, through which we can verify whether the application pool and website already exist. If not, the script creates a new application pool and website, relates them to each other, and configures both with the relevant settings.

    Figure 13: PowerShell for the website
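    A simplified sketch of such a script, reusing the earlier variables (not necessarily identical to the commands in the figure):

    # import IIS management cmdlets and the IIS: provider
    Import-Module WebAdministration

    # create the application pool if it does not exist and run it on .NET 4.0
    if (-not (Test-Path "IIS:\AppPools\$appPoolName"))
    {
        New-WebAppPool -Name $appPoolName
        Set-ItemProperty "IIS:\AppPools\$appPoolName" -Name managedRuntimeVersion -Value $appPoolDotNetVersion
    }

    # create the website if it does not exist and bind it to the pool, path and port
    if (-not (Test-Path "IIS:\Sites\$siteName"))
    {
        New-Website -Name $siteName -Port $port -PhysicalPath $pathPullServer -ApplicationPool $appPoolName
    }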

  6. After setting up the IIS website and application pool, we have to unlock some sections of the web.config file so that they can be changed and updated. By default, the parent configuration locks the authentication sections; unlocking them lets us change the authentication settings in web.config. There is a command-line utility named "Appcmd" that can be used to manage IIS. We will use appcmd to unlock the web.config sections and also to change the identity of the application pool.

    This can be done through PowerShell, unless we want to change the IIS configuration files manually by hand, as shown in Figure 14 and sketched below it. $appcmd refers to the command-line executable, which runs the unlock configuration commands for the authentication sections of web.config and also sets the application pool identity to LocalSystem.

    Figure 14: Unlocking web.config sections
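    A sketch of the appcmd calls described above (the section names listed are the usual authentication sections and are assumptions; verify them against the figure):

    # path to the IIS command-line tool
    $appcmd = "$env:windir\system32\inetsrv\appcmd.exe"

    # unlock the authentication-related configuration sections
    & $appcmd unlock config -section:anonymousAuthentication
    & $appcmd unlock config -section:basicAuthentication
    & $appcmd unlock config -section:windowsAuthentication

    # run the application pool as LocalSystem
    & $appcmd set apppool $appPoolName -processModel.identityType:LocalSystem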

  7. In pull mode, DSC stores all the configuration information and target node information in a database called Devices.mdb. This file is also installed when the DSC-Service feature is enabled. Another folder, "DscService", is created at "C:\Program Files\WindowsPowerShell\" while enabling "DSC-Service". We will copy the Devices.mdb file into this "DscService" folder.

    To achieve the same through PowerShell, we can execute the commands shown in Figure 15 and sketched below it.

    Figure 15: Database file
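    A sketch of the equivalent PowerShell:

    # copy the devices database into the DscService folder
    Copy-Item "$pshome\Modules\PSDesiredStateConfiguration\PullServer\Devices.mdb" -Destination "$env:ProgramFiles\WindowsPowerShell\DscService\Devices.mdb"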

  8. As a last step, we need to modify the web.config file for the website to reflect the following changes:

    a. The new location of the Devices.mdb database, together with a connection string to it, so that the DSC service can use this database for its internal management purposes.

    b. The "DscService" folder created in the last step also contains two folders named "Configuration" and "Modules". All configuration MOF files related to the pull server should be placed in the "Configuration" folder so that the pull server website can access them. The "Modules" folder should contain all the custom DSC resources and composite configurations; we will cover these in subsequent chapters. The web.config file needs the paths to these folders in order to access them.

    c. The database provider to be used for accessing the Devices.mdb database file.

    All of these web.config changes are additions to its appSettings section. This activity can be done manually by typing and adding entries to the appSettings section, as shown in Figure 16.

    Figure 16: Web.config

    As a result, the entire appSettings section in web.config should look as shown in Figure 17.

    Figure 17: Web.config additions

    To achieve the same through PowerShell, we can execute the commands shown in Figure 18 and sketched below it. Web.config is essentially an XML file; using the XML capabilities of PowerShell, we create a node for each new key-value entry and add it to the appSettings section. Finally, we save the web.config file so that the changes take effect.

    Figure 18: PowerShell web.config changes
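    A sketch of the idea; the key names and values below are assumptions based on the description above, so match them against the entries shown in Figure 17:

    # load web.config as XML
    $webConfigPath = "$pathPullServer\web.config"
    $xml = [xml](Get-Content $webConfigPath)

    # appSettings entries: database provider, connection string, and folder paths (assumed values)
    $settings = @{
        "dbprovider"        = "System.Data.OleDb"
        "dbconnectionstr"   = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\WindowsPowerShell\DscService\Devices.mdb;"
        "ConfigurationPath" = "C:\Program Files\WindowsPowerShell\DscService\Configuration"
        "ModulePath"        = "C:\Program Files\WindowsPowerShell\DscService\Modules"
    }

    # create an <add key="" value=""/> node per entry and append it to appSettings
    $appSettings = $xml.SelectSingleNode("//appSettings")
    foreach ($key in $settings.Keys)
    {
        $node = $xml.CreateElement("add")
        $node.SetAttribute("key", $key)
        $node.SetAttribute("value", $settings[$key])
        $appSettings.AppendChild($node) | Out-Null
    }

    # persist the changes
    $xml.Save($webConfigPath)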

  9. Now, if you have followed this chapter exactly as described, you can browse to http://localhost:8080/PSDSCPullServer.svc through Internet Explorer, and the response should look as shown in Figure 19. It shows the response from the web server with a couple of entities named "Action" and "Module". We will look into these details in a subsequent section of this chapter.

    Figure 19: Pull server test

Hope you enjoyed this post and happy pulling DSC configs!!

Cheers!

Azure Automation: WHO AM I

Azure Automation is gaining a lot of popularity, and it is quite a black box. You author a runbook, publish it and run it. But do you know which machine the runbook gets executed on, or the name of the user account used to execute it?

The answer is that we can figure it out!!

You can author a runbook in Azure Automation as shown below and execute it. It outputs the name of the local machine as well as the user under which the runbook is executed.


workflow Whoami
{
    inlinescript
    {
        # get the Windows identity the runbook is currently running under
        $WindowsID = [System.Security.Principal.WindowsIdentity]::GetCurrent()

        # wrap it in a WindowsPrincipal and output the identity details
        $WindowsPrincipal = New-Object System.Security.Principal.WindowsPrincipal($WindowsID)

        $WindowsPrincipal.Identity
    }
}

This prints the details of the identity under which the runbook was executing, as well as the host or domain name it belongs to. The output looks like the following.


IsAnonymous : False
Name : LsaSetupDomain\Administrator
Owner : S-1-5-21-3235083057-2096672557-690217298-500
User : S-1-5-21-3235083057-2096672557-690217298-500
Groups : {S-1-5-21-3235083057-2096672557-690217298-513, S-1-1-0, S-1-5-114, S-1-5-32-544...}
Token : 1012
Claims : {}
Actor :
BootstrapContext :
Label :
NameClaimType : http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name
RoleClaimType : http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid

The entire thing would look like the screenshot below.

Figure: AA identity

Note the name shown in yellow: it is LsaSetupDomain\Administrator.

Hope you enjoyed this post!

Cheers!

LCM Enumerations in WMF 5.0 February preview release

 

The Local Configuration Manager (LCM) in the WMF 5.0 February preview release has introduced new properties along with additional configuration values.

Below are the updated configuration values for some of its properties.

 

RefreshMode

"Push" and "Pull" have been the applicable values for the RefreshMode property since the beginning. Now an additional value, "Disabled", has been added to disable the Local Configuration Manager so that it does not apply any configuration to the target node.

 

ConfigurationMode

There is no change in the ConfigurationMode property. The applicable values are the same as in previous versions: "ApplyOnly", "ApplyAndMonitor" and "ApplyAndAutoCorrect".

 

ActionAfterReboot

This is a new property that determines the action the Local Configuration Manager should take after a target node restarts as a result of applying a configuration. The applicable values are "ContinueConfiguration" and "StopConfiguration".

 

DebugMode

DebugMode has gained a newer set of values. Earlier, DebugMode used to accept boolean values, either "true" or "false", determining whether the resource modules should be cached or not. It has gone through a complete change and now accepts four different values instead of a boolean true or false.

"None" signifies that DebugMode is off, the equivalent of "false" in previous versions.

"ForceModuleImport" forces the resource modules to be reloaded instead of using the cache. This is similar to the "true" value in previous versions.

"ResourceScriptBreakAll" helps in debugging DSC resources when the Local Configuration Manager executes their functions. More on it in subsequent blog posts!

"All" signifies that debugging as well as reloading of modules are both enabled.
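A minimal meta-configuration sketch that pulls the settable properties above together; it uses the classic LocalConfigurationManager block, and exact property support and syntax may vary between preview builds.

configuration LCMDemo
{
    Node 'localhost'
    {
        LocalConfigurationManager
        {
            RefreshMode       = 'Push'
            ConfigurationMode = 'ApplyAndAutoCorrect'
            ActionAfterReboot = 'ContinueConfiguration'
            DebugMode         = 'ForceModuleImport'
        }
    }
}

# generate the meta-MOF and apply it to the local node
LCMDemo -OutputPath .\LCMDemo
Set-DscLocalConfigurationManager -Path .\LCMDemo -Verbose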

 

LCMState

LCMState shows the current state of the LCM: whether it is busy executing and applying configurations, waiting for a node reboot, or doing nothing. Each of these three states is represented by a different value: "Busy" while the LCM is working on the target node, "PendingReboot" while the LCM is waiting for the node to restart, and "Ready" when the LCM is idle, waiting for commands to apply, pull or reapply a configuration.

We will delve deeper into these configurations in subsequent blog posts!

Stay tuned!

Cheers!!

Powershell cmdlets for Azure Automation now available!!

The latest Azure PowerShell release has added 43 new cmdlets to manage Azure Automation.

After downloading the latest version of Azure PowerShell, check that the build number is one that contains these cmdlets. It should be 0.8.14.

You can get the version number of the Azure module by executing a command like the one below.
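# one way to check the installed Azure module version (module name "Azure" assumed for the classic Azure PowerShell module)
Get-Module -ListAvailable -Name Azure | Select-Object Name, Version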

You can also execute a command like the one below to get all the available cmdlets belonging to Azure Automation.

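# list cmdlets whose noun starts with AzureAutomation (the noun filter is an assumption)
Get-Command -Module Azure -Noun AzureAutomation*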

The cmdlets primarily relate to Azure Automation assets like connections, credentials, variables and schedules. There are also cmdlets for managing accounts and runbooks.

We will deep-dive into each of these cmdlets and Azure Automation to know how they work.

Stay tuned!!

Cheers to Azure Automation !!