Wednesday, December 12, 2018

Unpredictable and therefore unreliable crawling of User Profiles in SharePoint Online

In many organizations that utilize SharePoint, the Human Resources (HR) administration is one of the origins from which data is propagated into the User Profiles. The presence of HR data in SharePoint User Profiles then allows that data to be retrieved for usage in SharePoint API based applications, e.g. JavaScript (vanilla up to SPFx), InfoPath, SharePoint Workflows, Microsoft Flow, ... An issue here is that in case of custom User Profile properties, the User Profile REST service does not support returning their value: Query SharePoint user profile by custom property. The approach to deal with that lack is to retrieve/query the custom User Profile properties via User Profile Search: Query and filter user profiles based on the value of custom property using REST.
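A minimal sketch of such a search-based retrieval, assumed to run on a SharePoint page (the managed property name 'CostCenter' and the account claim are illustrative; the custom profile property must be mapped to a retrievable managed property):

// Sketch: retrieve a custom User Profile property via the SharePoint Search REST API,
// by querying the 'Local People Results' result source.
// 'CostCenter' is an assumed managed property name; replace with your own mapping.
var url = _spPageContextInfo.webAbsoluteUrl +
    "/_api/search/query?querytext='AccountName:i%3A0%23.f|membership|jane.doe@contoso.com'" +
    "&sourceid='B09A7990-05EA-4AF9-81EF-EDFAB16C4E31'" +
    "&selectproperties='AccountName,CostCenter'";
var xhr = new XMLHttpRequest();
xhr.open("GET", url);
xhr.setRequestHeader("Accept", "application/json;odata=verbose");
xhr.onload = function () {
    var rows = JSON.parse(xhr.responseText)
        .d.query.PrimaryQueryResult.RelevantResults.Table.Rows.results;
    // Each row contains Cells.results as Key/Value pairs, a.o. the CostCenter value.
    console.log(rows);
};
xhr.send();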
A consequence of retrieving via Search is that the returned data may not be the actual value, but is only as fresh as the latest crawl of the User Profiles as search source. In case of an on-premises deployment, an organization has control over the crawling rate itself; the default setting is a full crawl per 15 minutes. The implication is that typically the data cached in the SharePoint Search Index is not outdated by more than a maximum of 15 minutes. This is acceptable for most business usages, as HR data is not updated continuously. However, in case of SharePoint Online, matters are different. Microsoft makes no concrete statement / promise on the period it takes for changed User Profile values to become visible in SharePoint Search. On experimentation, I observed that for one User Profile instance a change in a custom property was visible + retrievable via Managed Property within 30 minutes, while for another User Profile instance a change in the same custom property was still not picked up after a period of 3 days! This unpredictability makes the utilization of User Profile Search to retrieve User Profile property values less reliable.
I've consulted Microsoft Support on this topic. In a nutshell the response is that my observations reflect the current way of working, and that Microsoft cannot give a hard statement on the maximum duration within which User Profile property changes will become visible in SharePoint Search. And currently, in the SharePoint Online context, one does not have the option to explicitly request a reindex of User Profiles via a full crawl. Microsoft itself refers to a SharePoint UserVoice request to give your vote: Reindex User Profiles option.
Mikael Svenson posted a workaround that allows you to get "kinda explicit" control: How to trigger re-indexing of updated user profiles in SharePoint Online. Although for multiple reasons certainly not the preferred way, until Microsoft delivers on the raised UserVoice request, this approach is likely the best available to get changes in User Profile instances visible through SharePoint Search within at least a delineated elapsed time.

Friday, November 30, 2018

Beware: using Asset library for video capability fails due to missing feature activation

The out-of-the-box Asset library can be used as a (very) basic video capability. But be aware that for this to work flawlessly, it is required that the site collection feature “Video and Rich Media” is activated. If not active, video files can still be stored in the library, but the video functionality is broken and useless. The most important symptom is that the video files cannot be played from the library context: "<asset library>/forms/videos/videoplayerpage.aspx" is not available (HTTP 404). In addition, when you inspect uploaded video files, you will observe that although their library content type is “Video”, their parent Site ContentType is “Document Collection Folder” instead of “Video”:
Screenshots: incomplete content type chain vs. correct content type chain
Activating the “Video and Rich Media” feature corrects the content type chain, but already administrated videos are not automatically fixed. Moreover, already provisioned Asset libraries typically remain behind in an inconsistent state:
  1. The “Video” content type on library level still misses the essential site columns of “Video”
    Video file in Asset library with incomplete Video content type
    Video file in Asset library with complete Video content type
  2. Uploading video files even results in SharePoint throwing an error.
It is therefore best to check upfront whether the site collection feature is active, before creating an Asset library in which you aim to (potentially) administer videos. And in case you find out that the feature was not active after creation of an Asset library, take the pain / effort to provision a new Asset library, and redo the videos upload in that new library.
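A minimal sketch to verify the feature activation via REST, before provisioning the Asset library (note: the feature GUID below is to my knowledge the one of 'Video and Rich Media', but verify it in your own environment):

// Sketch: check whether the site collection feature 'Video and Rich Media' is active.
// The feature GUID is an assumption; verify it for your environment.
var videoFeatureId = "6e1e5426-2ebd-4871-8027-c5ca86371ead";
var xhr = new XMLHttpRequest();
xhr.open("GET", _spPageContextInfo.siteAbsoluteUrl +
    "/_api/site/features/getbyid(guid'" + videoFeatureId + "')");
xhr.setRequestHeader("Accept", "application/json;odata=verbose");
xhr.onload = function () {
    // A non-200 / empty result means the feature is not activated on this site collection.
    console.log(xhr.status === 200 ? "Feature active" : "Feature NOT active");
};
xhr.send();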

Tuesday, November 27, 2018

Beware: Azure AD B2B guests inconsistently resolved in PeoplePicker

When you employ Azure AD B2B for external sharing of SharePoint Online sites, you’re likely to encounter a peculiarity in the usage of the PeoplePicker: trying to resolve guests by typing in their external email as identity typically fails if this guest identity was not resolved before in this site. The workaround to force PeoplePicker to resolve the guest is to copy/paste the full external email in one go; apparently this somehow triggers the PeoplePicker control to query Azure AD for the guest identity.
As this peculiarity in PeoplePicker behavior is difficult to clarify (justify) to business users, I raised a service request: Microsoft Support to clarify the difference in PeoplePicker resolving behavior on interactive typing versus dropping the same identity at once. The quickly received answer is that the behavior is by design: on interactive typing, the PeoplePicker queries the User Information List in the site collection; however, since the guest was not yet authorized for the site, his/her account will not yet be administrated in that hidden list. A rather unsatisfying answer - arrogant even -, coming from an IT perspective without any understanding for the business user; for whom this deviation in PeoplePicker behavior is absolutely illogical. Moreover, regular member accounts are successfully resolved by the PeoplePicker, independent of whether such an account is already present in the User Information List. So also there, a deviation between regular Member versus Guest accounts.

Wednesday, October 31, 2018

Beware: governance of SharePoint site underneath MS Teams largely gone

In most organizations that employ SharePoint for collaboration and content handling, governance is applied (or at least attempted) to give some structure and consistency to the SharePoint usage. Some typical elements of such SharePoint governance are:
  • The powerful authorization via the Site Collection Administrator (SCA) role is reserved to IT support only; business users are authorized via SharePoint Groups + Permission Levels (see e.g. Site Owner vs Site Collection Administrator)
  • Pre-defined site structures (in the old days via Site Definitions; nowadays via Site Templates, provisioning code (e.g. PnP provisioning))
  • Organization consistent branding of the sites: logo, site classification, layout, ...
  • Naming conventions for site titles and URLs
  • Version Control + Content Approval policies
  • Metadata (Managed + Folksonomy)
  • Controlled availability of SharePoint Designer, enabling the business power-users to self-create workflows, customize views, create structure, ...
  • Prerequisites imposed on the site requestor, checked upon by the helpdesk handling site provision process
  • Lifecycle model
  • ...
Microsoft itself acknowledges the importance of SharePoint Governance, and delivers support to its customers on this topic via guidance, trainings and templates (e.g. Overview: best practices for managing how people use your team site).
And then there was the new concept of MS Teams...; highly promoted by Microsoft, and moreover highly appreciated and valued by the business users. Among others, the business values the concept of self-service creation of MS Teams, in particular when comparing it with the (perceived) cumbersome governance process on SharePoint site provisioning in some organizations. And some of the business users are even more pleasantly surprised when they detect that as part of the MS Teams instance creation, also a SharePoint Site Collection is created 'underneath'. In the MS Teams concept, Microsoft makes pragmatic reuse of the availability of SharePoint Online as part of the Office 365 suite for the Files handling capability. It is even a generally applied pattern by Microsoft to position and use SharePoint more and more as 'backend' underneath its other products and services. Pre-online this was already done with MS Dynamics and MS Project; and now thus with MS Teams, Office 365 Groups, ...
But although within the MS Teams context Microsoft intended the created SharePoint Site to deliver the Files capability, knowledgeable and curious business users quickly discover that the connected SharePoint Site is, well, a full-blown SharePoint site. Including all the SharePoint capabilities that they are familiar with when working with a standalone provisioned Site Collection. Plus, the (default) governance on SharePoint site collections created underneath an MS Teams instance is much less restrictive. One of the most significant deviations is that all the MS Teams owners automatically, and outside IT control, get assigned the Site Collection Administrator role! Other deviations in permission handling are that the MS Teams owner(s) - by derivation thus also the SCAs of the MS Teams SharePoint site - can share access on the SharePoint level, deviating from the permission management executed on the level of the enclosing MS Teams. Yet another is the ability to configure sharing with externals on the SharePoint level, while that might not be configured (or even allowed) at the MS Teams level.
How should any Office 365 customer deal with these differences in the governance and support of SharePoint sites created standalone, versus the ones created as element of other Office 365 services? I do not have a conclusive view on this yet, but moreover I also have not yet identified such a view brought to us by Microsoft. The impression is that Microsoft just kinda 'lets it go', and has its customers come to a positioning and best-practice(s) themselves. In my opinion this is a shortcoming from Microsoft from the support and guidance perspectives: we know that one of the biggest hurdles with IT and Collaboration Tools is user adoption, and confusion about what tool to use when. Delivering the same SharePoint collaboration tool via different creation processes, and with different governance + support models as result, does not help bring a clear collaboration story. I understand and applaud the pragmatic decision by Microsoft to utilize the SharePoint capabilities within other products, but I fail to understand their considerations to make that same SharePoint site accessible on itself, outside the simplified interface via MS Teams.

Saturday, September 22, 2018

Utilize Azure Function to resolve lack of CORS awareness within SharePoint Online active authentication flow

In an earlier post I wrote about facing a CORS issue when interoperating from an external JavaScript client via OAuth passive authentication with the SharePoint Online REST API, and on how to easily resolve this via Azure Function Proxy. That approach works for OAuth based passive authentication, with the OAuth AccessToken exchanged via HTTP header. However, in case the external client wants to apply active authentication, the situation complicates severely. The explanation is that in this authentication flow, the SharePoint Online authentication token is exchanged as HttpOnly cookie:
The SharePoint Developer Support Team Blog published an article that outlines which set of requests one needs to implement for SharePoint Online active authentication. In case of non-browser clients, this is sufficient to enable SharePoint Online interoperability from an external client. For instance, I've applied this active authentication flow within an Excel VBA client context, a colleague of mine used it from a Curl script, and yet another from a Java application. In a non-browser client context, the received Active Authentication HTTP response can successfully be parsed to extract the SPOIDCRL cookie.
Modern browsers utilize Cross-Origin protection themselves to mitigate risks: an HttpOnly cookie cannot be read by clientside code, and the browser only includes the cookie in outgoing requests pointing to the same domain as from which the HttpOnly cookie was earlier received within the current browser session.
Modern secure browsers have inherent protection that prevents access to secure HttpOnly cookies: on the wire [Fiddler] the SPOIDCRL cookie is included in the response, yet the browser [here Chrome, via Developer Tools] does not allow access to it.
The implication is that the custom CORS handling needs to be further extended to have the browser include the required SharePoint Online authentication cookie cross-domain in SharePoint Online REST API calls. Although the same approach via Azure Function Proxy can successfully be utilized to cross-domain request the SharePoint Online active authentication endpoint, the browser accepts the returned cross-origin response but your clientside code is not enabled to extract the SharePoint Online authentication token included as HttpOnly cookie (SPOIDCRL). And even if that would be possible, the second blocker is that the SharePoint Authentication Token must be included as cookie for successful authentication against SharePoint Online, but browsers only include the cookie for requests going to the same domain. Meaning that all SharePoint interoperability requests coming from the external JavaScript client must be proxied to the same domain as that of the proxied Active Authentication endpoint: <your tenant>/_vti_bin/IDCRL.svc.
On the webclient side the needed changes are minimal: include ‘withCredentials: true’ in the issued XMLHttpRequest requests (see the sketch after this list). But on the receiving and processing Azure Function (Proxy) side, more must be changed:
 •   Function-Apps have out-of-the-box platform support for CORS; as utilized for the CORS handling in case of the OAuth passive authentication flow. However, I experienced that this is limited to only returning the CORS headers Allowed-Domain + Allowed-Method. Currently, Function-Apps platform CORS does not include support for cross-domain authentication handling via ‘withCredentials’. I tried alternatives to explicitly set the missing 'Access-Control-Allow-Credentials' header myself in the responseOverrides of the Azure Function Proxy; and learned that in case CORS is configured on the platform level, any Cross-Origin headers in the responseOverrides object are just silently ignored; not included in the resulting http response of the Azure Function Proxy call. Then I tried to instead nullify the Function-App platform CORS setting, as described in Azure Functions Access-Control-Allow-Credentials with CORS. This works to receive the SharePoint Online authentication token cross-domain.
 •   But for the next usage, to have the browser send the cookie, all your SharePoint Online REST calls must go to the same domain as from which the cookie was received. This can be achieved by also proxying these calls via the Azure Function Proxy. This in itself is very simple to configure in the Azure Function Proxy via generic pattern matching in a new proxy:
However, then on invoking that proxy cross-domain from the client / browser, it is refused by the Azure Function Proxy with HTTP 405 / Not Allowed ➔ because I removed all allowed domains from the platform CORS configuration of the Azure Function, while the browser includes in the request the CORS 'Origin' header that now no longer matches anything on the Azure Function-App level (a typical 'chicken-and-egg' situation).
 •   I then tried with '*' as allowed domain in the platform CORS configuration. But this again results in the explicit additional CORS response headers via the responseOverrides object being ignored. And in addition, '*' is not allowed on browser level in conjunction with 'withCredentials': Failed to load https://msdnwvstrienspocors.azurewebsites.net/IDCRL.svc: Response to preflight request doesn't pass access control check: The value of the 'Access-Control-Allow-Origin' header in the response must not be the wildcard '*' when the request's credentials mode is 'include'. Origin 'https://wvstrien.sharepoint.com' is therefore not allowed access. The credentials mode of requests initiated by the XMLHttpRequest is controlled by the withCredentials attribute.
 •   As implication of the above enumerated experiences, I conclude that in their current implementation, Azure Function Proxies are unfit to resolve the CORS handling in the context of the SharePoint Online active authentication flow. Therefore I decided on an alternative approach: explicitly building the CORS-aware proxy behavior myself in custom Azure Functions. I minimally need to implement 2 custom functions; the CORS aware/allowed access to the default MSO endpoint https://login.microsoftonline.com/rst2.srf and to https://#ADFSHOST#/adfs/services/trust/2005/usernamemixed (username/password ADFS endpoint) can still be arranged via an Azure Function Proxy, similar as earlier configured for the OAuth passive authentication endpoint. But for CORS aware access to the combination of IDCRL.svc and any authenticated request to your SharePoint Online tenant, the Azure Function Proxy approach thus falls short. For each of these 2, a custom proxy implementation is required.
 •   I observed that also in case of explicit CORS handling coded in an Azure Function, any Function-App platform CORS configuration still prevails above it. Therefore it is required to completely disable platform CORS on the Function-App. As the CORS-aware Azure Function Proxies for the first 2 requests in the active authentication flow do depend on the platform CORS configuration, I decided to split the setup into 2 separate Function-Apps, each with its own domain. I could also have resorted to a custom proxy build for the first 2 requests, included in the same App container, but wherever possible I prefer a 'configuration' and out-of-the-box approach above (maintaining) custom code. So I stick with utilization of Azure Function Proxy where possible.
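For completeness, the minimal webclient-side change referred to above is merely a sketch like this (the Function-App host name is hypothetical):

// Sketch: issue the proxied SharePoint Online REST call with credentials mode enabled,
// so the browser includes the earlier received SPOIDCRL HttpOnly cookie cross-domain.
var xhr = new XMLHttpRequest();
xhr.open("GET", "https://<your-function-app>.azurewebsites.net/_api/web/title");
xhr.withCredentials = true; // instruct the browser to include cookies cross-domain
xhr.setRequestHeader("Accept", "application/json;odata=verbose");
xhr.onload = function () { console.log(JSON.parse(xhr.responseText).d.Title); };
xhr.send();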
Complete CORS-enabled setup for all of the 4 requests involved in SharePoint Online active authentication: the first 2 can be CORS-enabled via Azure Function Proxy, the last 2 must be CORS-enabled via custom Azure Functions.
Configuration of the Azure Function Proxies to CORS-enable "<your custom STS>/adfs/services/trust/2005/usernamemixed" and "https://login.microsoftonline.com/RST2.srf"
Impression of the custom Azure Functions: CORS-enabling "<your SPO tenant>/_vti_bin/idcrl.svc" is simple; proxying SharePoint REST API calls is more complex and challenging. Note: the custom proxy for active authenticated requests to your SharePoint tenant is also fit for non-REST requests, e.g. to browse your SharePoint Online sites.
Cross-Domain / CORS prepared JavaScript browser client, interoperating with SharePoint Online REST API


This is what happens in the webclient (browser) / SharePoint Online interoperability with the adjustment for CORS:
First request, automatically issued by the security-aware browser: OPTIONS to start the CORS preflight
On server/client confirmed preflight, get the SPO authentication cookie cross-domain
Utilize the retrieved SPO authentication cookie within browser-issued SPO REST calls

Sunday, September 16, 2018

Digest Authenticated API should obey CORS

On securing access to a service API, Digest Authentication delivers stronger security than Basic Authentication. In case the service API is invoked from JavaScript code via XMLHttpRequest, that client application must explicitly handle the Digest Authentication handshake itself. A convenient library to use for that is digestAuthRequest.js (although I had to tweak it a bit to get it working: for availability of CryptoJS within the library, and to apply only the 'path' part of the URL in generating the digest token). But also the API itself must obey the standards: thus the Digest Authentication protocol, but in addition also the CORS protocol in case of cross-domain usage. Otherwise modern browsers will refuse to read the authentication challenge returned by the API in the first step of the authentication handshake:
The root cause is that the XMLHttpRequest::getResponseHeader() method in its default mode can only access simple response headers, any of: Cache-Control, Content-Language, Content-Type, Expires, Last-Modified, and Pragma (see Using CORS). In this set, the WWW-Authenticate header is missing. So if you want to enable JavaScript clients of your API to successfully complete Digest Authentication and use your API, you have to include that response header name in the value of the 'Access-Control-Expose-Headers' response header.
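A minimal sketch of what that looks like on the API side (illustrated here with a Node.js/Express-style handler; the framework and the client origin are my own example, not prescribed):

// Sketch: expose the WWW-Authenticate header to cross-domain JavaScript clients,
// so a library like digestAuthRequest.js can read the Digest challenge.
const express = require("express");
const app = express();

app.get("/api/data", (req, res) => {
    res.set("Access-Control-Allow-Origin", "https://client.example.com"); // hypothetical client origin
    // Without this header, browsers hide WWW-Authenticate from getResponseHeader():
    res.set("Access-Control-Expose-Headers", "WWW-Authenticate");
    res.set("WWW-Authenticate", 'Digest realm="api", qop="auth", nonce="<nonce>", opaque="<opaque>"');
    res.status(401).end(); // first step of the Digest handshake: return the challenge
});

app.listen(3000);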
Without a properly returned 'Access-Control-Expose-Headers' by the API, digestAuthRequest fails to access the required WWW-Authenticate header when invoked from a cross-domain JavaScript based client:
With a properly returned 'Access-Control-Expose-Headers' by the API, digestAuthRequest succeeds to access the required WWW-Authenticate header when invoked from a cross-domain JavaScript based client:

Wednesday, August 22, 2018

Resolve the conflict between 'Content Approval' and SharePoint Workflow

Business usage scenario: a content management capability to first work on a draft of data items, and on peer approval publish a copy of the data item into another list.
The business power user recognized the richness of the SharePoint building blocks to realize the scenario:
  1. Generic list with 'Content Approval' and 'Versioning' configured;
  2. SharePoint Designer Workflow on ItemChange; in which the approval status of item is checked, and on condition of 'Approved' create a copy of the item in the 'publish' location.
In theory this setup should function correctly. However, in (awkward) SharePoint practice it does not:
  • The execution of the workflow triggered on ItemChange itself results in the reached workflow stage being administrated as 'metadata' in the item on which the workflow is triggered.
  • And although the actual content of the item remains unchanged, the standard 'Content Approval' handling treats this as a change relative to the stage in which the data item was approved; and automatically resets the approval state to 'Pending';
  • When in the workflow the 'Approval Status' field is retrieved, it is therefore already reset to 'Pending'.
Thus not only does the publication process fail due to this reset of the condition value, but also the data item itself is reset; as if no 'Approve/Reject' decision was made by the peer reviewer. This is beyond confusing, also frustrating; and does not help in the adoption of SharePoint as underlying business platform.
The business user consulted me for help. I quickly learned that the experienced behavior is a known issue in the combination of 'Content Approval' and SharePoint Designer Workflow. A functional error which cannot be prevented, that is, when utilizing the out-of-the-box building blocks 'Content Approval' plus Workflow. A possible way-out is then to 'build' a custom 'Content Approval'. But that I consider a non-desired and weak approach. The OOTB Content Approval is available to use - for business users via configuration -, so that should be respected and acknowledged.
As prevention upfront is thus not possible, I switched to an approach that mitigates afterwards. In the triggered workflow, retrieve the 'approval status' of the previous version of the current item, and use that for the condition evaluation. If 'Approved', then publish the item; and in addition also restore the approved state in the item from the workflow execution. The beauty here is that also in this mitigation the richness of the SharePoint platform can be utilized: the SharePoint REST API to retrieve the previous version of the data item, and standard SPD workflow actions to call an Http Web Service plus set the approval status of the data item.
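For illustration, a sketch of the REST retrieval involved (list title and item id are examples; the internal field name '_ModerationStatus' surfaces in REST as 'OData__ModerationStatus'):

// Sketch: retrieve the approval status of the previous version of a list item,
// assuming the list item versions REST endpoint available in SharePoint Online.
var url = _spPageContextInfo.webAbsoluteUrl +
    "/_api/web/lists/getbytitle('Drafts')/items(3)/versions" +
    "?$select=VersionLabel,OData__ModerationStatus&$top=2";
var xhr = new XMLHttpRequest();
xhr.open("GET", url);
xhr.setRequestHeader("Accept", "application/json;odata=verbose");
xhr.onload = function () {
    var versions = JSON.parse(xhr.responseText).d.results;
    // Versions are returned newest first: [0] = current, [1] = previous.
    // Moderation status values: 0 = Approved, 1 = Denied, 2 = Pending, 3 = Draft.
    console.log("Previous approval status:", versions[1].OData__ModerationStatus);
};
xhr.send();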

Thursday, July 12, 2018

Utilize Azure Function Proxy to resolve lack of CORS awareness within passive OAuth authentication flow

The steps to issue OAuth based authenticated SharePoint Online REST API calls from a SharePoint-external client context are well-documented elsewhere, e.g. in Access SharePoint Online using Postman. What is missing in these outlines is the notification that in case the external client is a JavaScript based webapplication, the setup will fail due to Cross-Origin aka Cross-Domain security protection by modern browsers; unless the browser is put in unsafe mode (not recommended!). The problem here is not within the modern browsers, as these are all CORS prepared / supporting. The real cause is that the REST call to 'https://accounts.accesscontrol.windows.net/<tenantid>/tokens/OAuth/2' does not return a CORS aware response; and a browser running in safe mode will refuse to accept the response coming from this different domain.

Browser / Cross-Domain issue

Augment the response to be CORS-aware

For browsers to accept the cross-domain OAuth authentication flow, the solution is to modify the received response such that it is augmented with the needed CORS headers. In a first attempt, I tried to augment the response via overriding the (methods of the) XMLHttpRequest object in JavaScript. But not surprisingly this fails: the browser's built-in Cross-Origin protection inspects the HTTP response on native level, and cannot be deceived by manipulating the received HTTP response within the JavaScript runtime context. From a security perspective this makes sense, otherwise the Cross-Origin protection could easily be circumvented in malicious code.
The valid approach is that the HTTP response as received on HTTP protocol level itself includes the missing CORS headers, before reaching the calling browser. Of course it is not possible for an Office 365 customer organization to modify the behavior of the externally hosted Microsoft SaaS service. So we need to 'proxy' the external service endpoint, and include the missing CORS headers in the proxy response. Previously this would require either utilizing the capabilities of a reverse proxy in the organization's landscape, or custom coding an own endpoint that acts as proxy between the client and the invoked (external) service. But last year Microsoft released the concept of Azure Function Proxies, and these can out-of-the-box be used in a no-code / configuration-only manner to proxy the call to 'https://accounts.accesscontrol.windows.net/<tenantid>/tokens/OAuth/2'.

Configure the Azure Function Proxy
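A minimal proxies.json sketch of such a proxy configuration (the allowed client origin is hypothetical; when platform CORS is enabled on the Function-App, the Access-Control headers can also come from there):

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "OAuthTokenProxy": {
      "matchCondition": {
        "methods": [ "POST", "OPTIONS" ],
        "route": "/tokens/OAuth/2"
      },
      "backendUri": "https://accounts.accesscontrol.windows.net/<tenantid>/tokens/OAuth/2",
      "responseOverrides": {
        "response.headers.Access-Control-Allow-Origin": "https://client.example.com",
        "response.headers.Access-Control-Allow-Headers": "Content-Type"
      }
    }
  }
}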

Browser / Cross-Domain allowed via Azure Function Proxy

Tuesday, June 12, 2018

Inject dynamic-filtering into classic-mode ListView

Earlier in my SharePoint "life", I delivered a capability in which a COTS application UI with an ASP.NET GridView was on-the-fly augmented with dynamic filtering by utilizing the list.js library: On-the-fly add client-side filtering and sorting to GridView. On occasion I refer to this as a showcase of how, with simple means, a richer user experience can be delivered in SharePoint context. Last week I showed this again, and also this business user was charmed by it. But he asked to have it applied to a standard SharePoint ListView, in particular one in datasheet/quick-edit layout. I took on this challenge, with successful result.
Screenshots to visualize the effect:
The capability itself is delivered as a generic utility and deployed via a private CDN. To activate it on a list-view page, one merely needs to include a reference to the EnrichListView.js library via a ScriptEditor webpart, as sketched below.
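A minimal sketch of that ScriptEditor content (the CDN host name is a placeholder for the private CDN):

<!-- Sketch: activate the dynamic-filtering enrichment on a classic list-view page -->
<script type="text/javascript" src="https://cdn.example.com/libs/EnrichListView.js"></script>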

High level architecture Microsoft Stream


Sunday, May 20, 2018

Authenticate from Curl into SharePoint Online with Modern Authentication

A code-snippet for interoperability from a Curl context - which could for example be a Linux or MacOS workstation / server - to Office 365 SharePoint Online; with service-account based authentication by applying the Active / Modern Authentication protocol handling:
#General variables
ProxyAccount="sa-curlAccount"
ProxyPassword="******************"
ProxyProtocol="http"
ProxyServer="xxx.xxx.xxx.xxx"
ProxyPort="8080"
SharePointCurlAccount="sa-curlAccount"
SharePointOnlineTenant="<URL of SharePoint Online tenant>"
UploadFile="<file to upload>"
UploadLocation="<URL of SharePoint Document Library>"

#Fixed variables
OUTPUT=${HOME}/Interop/output
TMP=${HOME}/Interop/tmp/spo

#the following steps are required to upload data from Curl context to SharePoint Online:
#
#1. Retrieve an authentication cookie to Office 365 through invocation of webservices
#1.a. (Optional) Step 0: determine the URL of the custom Security Token Service (STS) to next
#     request a SAML:assertion for account identified by credentials
#1.b. Step 1: request SAML:assertion from the identified custom STS for account identified by
#     credentials
#1.c. Step 2: use the SAML:assertion to request binary security token from Office 365
#1.d. Step 3: use the binary security token to retrieve the authentication cookie
#2. Step 4: Use that Office 365 authentication cookie in subsequent webservice requests to
#   SharePoint Online REST API
 
#1.a. (Optional) Step 0: determine the URL of the custom Security Token Service (STS) to next
#     request a SAML:assertion for account identified by credentials (outside datacenter, with proxy)
curl -U ${ProxyAccount}:${ProxyPassword} -k -x ${ProxyProtocol}://${ProxyServer}:${ProxyPort} -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "login=${SharePointCurlAccount}&xml=1" https://login.microsoftonline.com/GetUserRealm.srf -w "\n" > ${TMP}/O365_response_step_0

#Extract requested STSAuthURL from response step 0
STSURL=`sed -n 's:.*<STSAuthURL>\(.*\)</STSAuthURL>.*:\1:p' ${TMP}/O365_response_step_0`

#Create input for step 1
File: O365_request_step_1-1

<?xml version="1.0" encoding="UTF-8"?>
<s:Envelope
    xmlns:s="http://www.w3.org/2003/05/soap-envelope"
    xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"
    xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"
    xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy"
    xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"
    xmlns:wsa="http://www.w3.org/2005/08/addressing"
    xmlns:wssc="http://schemas.xmlsoap.org/ws/2005/02/sc"
    xmlns:wst="http://schemas.xmlsoap.org/ws/2005/02/trust">
    <s:Header>
        <wsa:Action s:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</wsa:Action>
        <wsa:To s:mustUnderstand="1">https://sts.<tenant>.com/adfs/services/trust/2005/usernamemixed</wsa:To>
        <wsa:MessageID>b07da3ec-9824-46a5-a102-2329e0c5f63f</wsa:MessageID>
        <ps:AuthInfo
            xmlns:ps="http://schemas.microsoft.com/Passport/SoapServices/PPCRL" Id="PPAuthInfo">
            <ps:HostingApp>Managed IDCRL</ps:HostingApp>
            <ps:BinaryVersion>6</ps:BinaryVersion>
            <ps:UIVersion>1</ps:UIVersion>
            <ps:Cookies></ps:Cookies>
            <ps:RequestParams>AQAAAAIAAABsYwQAAAAxMDMz</ps:RequestParams>
        </ps:AuthInfo>
        <wsse:Security>
            <wsse:UsernameToken wsu:Id="user">
                <wsse:Username>sa-curlAccount@<tenant>.com</wsse:Username>
                <wsse:Password>*************</wsse:Password>
            </wsse:UsernameToken>
            <wsu:Timestamp Id="Timestamp">
File: O365_request_step_1-2

            </wsu:Timestamp>
        </wsse:Security>
    </s:Header>
    <s:Body>
        <wst:RequestSecurityToken Id="RST0">
            <wst:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</wst:RequestType>
            <wsp:AppliesTo>
                <wsa:EndpointReference>
                    <wsa:Address>urn:federation:MicrosoftOnline</wsa:Address>
                </wsa:EndpointReference>
            </wsp:AppliesTo>
            <wst:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</wst:KeyType>
        </wst:RequestSecurityToken>
    </s:Body>
</s:Envelope>
cat ${TMP}/O365_request_step_1-1 > ${TMP}/O365_request_step_1
echo "<wsu:Created>`date -u +'%Y-%m-%dT%H:%M:%SZ'`</wsu:Created>" >> ${TMP}/O365_request_step_1
echo "<wsu:Expires>`date -u +'%Y-%m-%dT%H:%M:%SZ' --date='-15 minutes ago'`</wsu:Expires>" >> ${TMP}/O365_request_step_1
cat ${TMP}/O365_request_step_1-2 >> ${TMP}/O365_request_step_1

#1.b. Step 1: request SAML:assertion from the identified custom STS for account identified by
#     credentials (internal datacenter, without webproxy to outside)
curl -X POST -H "Content-Type: application/soap+xml; charset=utf-8" -d "@${TMP}/O365_request_step_1" ${STSURL} -w "\n" > ${TMP}/O365_response_step_1

#Extract requested SAML:assertion from response step 1
sed 's/^.*\(<saml:Assertion.*saml:Assertion>\).*$/\1/' ${TMP}/O365_response_step_1 > ${TMP}/O365_response_step_1.tmp

#Create input for step 2
File: O365_request_step_2-1

<?xml version="1.0" encoding="UTF-8"?>
<S:Envelope
    xmlns:S="http://www.w3.org/2003/05/soap-envelope"
    xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"
    xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy"
    xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"
    xmlns:wsa="http://www.w3.org/2005/08/addressing"
    xmlns:wst="http://schemas.xmlsoap.org/ws/2005/02/trust">
    <S:Header>
        <wsa:Action S:mustUnderstand="1">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</wsa:Action>
        <wsa:To S:mustUnderstand="1">https://login.microsoftonline.com/rst2.srf</wsa:To>
        <ps:AuthInfo
            xmlns:ps="http://schemas.microsoft.com/LiveID/SoapServices/v1" Id="PPAuthInfo">
            <ps:BinaryVersion>5</ps:BinaryVersion>
            <ps:HostingApp>Managed IDCRL</ps:HostingApp>
        </ps:AuthInfo>
        <wsse:Security>
File: O365_request_step_2-2

        </wsse:Security>
    </S:Header>
    <S:Body>
        <wst:RequestSecurityToken xmlns:wst="http://schemas.xmlsoap.org/ws/2005/02/trust" Id="RST0">
            <wst:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</wst:RequestType>
            <wsp:AppliesTo>
                <wsa:EndpointReference>
                    <wsa:Address>sharepoint.com</wsa:Address>
                </wsa:EndpointReference>
            </wsp:AppliesTo>
            <wsp:PolicyReference URI="MBI"></wsp:PolicyReference>
        </wst:RequestSecurityToken>
    </S:Body>
</S:Envelope>
cat ${TMP}/O365_request_step_2-1 > ${TMP}/O365_request_step_2
cat ${TMP}/O365_response_step_1.tmp >> ${TMP}/O365_request_step_2
cat ${TMP}/O365_request_step_2-2 >> ${TMP}/O365_request_step_2
rm ${TMP}/O365_response_step_1.tmp

#1.c. Step 2: use the SAML:assertion to request binary security token from Office 365
#     (outside datacenter, with proxy)
curl -U ${ProxyAccount}:${ProxyPassword} -k -x ${ProxyProtocol}://${ProxyServer}:${ProxyPort} -X POST -H "Content-Type: application/soap+xml; charset=utf-8" -d "@${TMP}/O365_request_step_2" https://login.microsoftonline.com/RST2.srf -w "\n" > ${TMP}/O365_response_step_2

#Extract requested binary security token from response step 2
sed 's/^.*\(<wsse:BinarySecurityToken.*wsse:BinarySecurityToken>\).*$/\1/' ${TMP}/O365_response_step_2 > ${TMP}/O365_response_step_2.tmp

#Create input for step 3
cat ${TMP}/O365_response_step_2.tmp | cut -d'>' -f2 | cut -d'<' -f1 > ${TMP}/O365_request_step_3
BinarySecurityToken=`cat ${TMP}/O365_request_step_3`
rm ${TMP}/O365_response_step_2.tmp

#1.d. Step 3: use the binary security token to retrieve the authentication cookie (outside
#     datacenter, need to pass webproxy)
curl -v -U ${ProxyAccount}:${ProxyPassword} -k -x ${ProxyProtocol}://${ProxyServer}:${ProxyPort} -X GET -H "Authorization: BPOSIDCRL ${BinarySecurityToken}" -H "X-IDCRL_ACCEPTED: t" -H "User-Agent:" ${SharePointOnlineTenant}/_vti_bin/idcrl.svc/ > ${TMP}/O365_response_step_3 2>&1

#Remove DOS ^M from response step 3
cat ${TMP}/O365_response_step_3 | sed 's/^M//' > ${TMP}/O365_response_step_3.tmp

#Extract requested authentication cookie from response step 3 and create input for step 4
echo "Set-Cookie: SPOIDCRL=`cat ${TMP}/O365_response_step_3.tmp | grep Set-Cookie | awk -F'SPOIDCRL=' '{print $2}'`" > ${TMP}/O365_request_step_4
rm ${TMP}/O365_response_step_3.tmp

#2. Step 4: Use that Office 365 authentication cookie in subsequent webservice requests to
#   SharePoint Online REST API (outside datacenter, with proxy)
curl -U ${ProxyAccount}:${ProxyPassword} -k -x ${ProxyProtocol}://${ProxyServer}:${ProxyPort} -b ${TMP}/O365_request_step_4 -T "{${OUTPUT}/${UploadFile}}" ${UploadLocation}

exit 0
Alternative for the upload handling; interoperation via the SharePoint REST API / webservice:
curl -U ${ProxyAccount}:${ProxyPassword} -k -x ${ProxyProtocol}://${ProxyServer}:${ProxyPort} -b ${TMP}/O365_request_step_4 -X POST -H "Accept: application/json;odata=verbose" -d "" ${SharePointOnlineTenant}/_api/contextinfo > ${TMP}/O365_response_step_4.tmp

FormDigest=`sed -n 's/.*"FormDigestValue":"\([^"]*\)".*/\1/p' ${TMP}/O365_response_step_4.tmp`
rm ${TMP}/O365_response_step_4.tmp

curl -U ${ProxyAccount}:${ProxyPassword} -k -x ${ProxyProtocol}://${ProxyServer}:${ProxyPort} -b ${TMP}/O365_request_step_4 -X POST -H "X-RequestDigest: ${FormDigest}" -H "X-HTTP-Method: PUT" --data-binary "@${OUTPUT}/${UploadFile}" "${SharePointOnlineTenant}/teams/siteX/_api/web/GetFileByServerRelativeUrl('Shared%20Documents/SubFolder/${UploadFile}')/\$value"

Friday, May 11, 2018

How-to resolve peculiarity with .aspx file upload from automated client context

The capabilities (powers) of SharePoint as underlying business application platform can be utilized in multiple ways. An example of a pragmatic one is to utilize SharePoint as authorized web-distribution platform for content created elsewhere. The added values it brings here are that the origin of the content itself does not need to be (made) accessible for the readers, there is no need to (web)serve the content yourself, and the permission handling of SharePoint can be utilized to only make the content available for authorized persons.
This simple application usage is for instance applied to continuously publish and distribute a system monitoring dashboard report on infra level from Linux servers to the monitoring people. They do not have / are not allowed access to the Linux servers in the datacenter, but are granted access to SharePoint as application platform. This worked perfectly, until we recently migrated the hosting site from SharePoint on-prem to SharePoint Online.
The problem symptom is that the uploaded .aspx file, on selecting it in the SharePoint Online UI, does not open in the browser, but instead starts the ‘Download / Save As’ behavior. Which clearly obstructs the SharePoint role as host of the published infra dashboard. Other .aspx files in the same library that were migrated from the on-prem source site all do open in the browser. That rules out document library settings. So it must be directly tied to the upload of the file. The particular upload is via Curl – which gave us some challenges to authenticate against SharePoint Online, but I will post on that separately –, but once uploaded, nothing can be identified that clarifies why this file behaves differently from the other .aspx files in the library. I inspected the document item properties, even up to detailed level via SharePoint Designer: all the same. The only noticeable difference appears when trying to resolve the url to the document in SharePoint Designer via the file item properties: for the troublesome document this results in ‘file not found’.
Strange, as the file is clearly present; and as such accessible both in the browser via the SharePoint listview UI, and when opening the library in Windows Explorer via ‘Open with Explorer’. Heck, even with sync via OneDrive, the file is included in the synced library content.
So this really kept us puzzled. Until the business user himself remembered an action we did on restoring the upload via Curl: as good SharePoint citizen, I reduced the permission level of the automated client account from ‘Full Control’ to ‘Contributor’. This turned out to be the key to explaining and next resolving the issue. On SharePoint level, uploaded .aspx files are also treated as (content) page. And for a completed upload + administration, the account uploading an .aspx file must have ”Add and Customize Pages - Add, change, or delete HTML pages or Web Part Pages, and edit the Web site using a Microsoft SharePoint Foundation-compatible editor”. And that permission is missing from the “Contributor” permission level. It does have "Add items to library", and therefore the upload itself succeeds from the automated client context. But the next processing on SharePoint (Online) side after the file upload, to convert it into a browsable page context, is not allowed with only 'Contributor'. The needed permission is included in ‘Full Control’, but that gives away too much control to the automated client account. Applying the ‘Least Privilege’ security principle, I therefore configured a new Permission Level “Upload ASPX page”, included the needed permission, and assigned this permission level to the automated client account.
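For reference, a sketch of how such a tailored permission level can be scripted via JSOM (the permission set shown is indicative, not the complete set to include):

// Sketch: create a custom permission level 'Upload ASPX page' on the site collection,
// including the crucial 'Add and Customize Pages' permission.
var ctx = SP.ClientContext.get_current();
var web = ctx.get_site().get_rootWeb(); // role definitions live on the root web
var perms = new SP.BasePermissions();
perms.set(SP.PermissionKind.viewListItems);
perms.set(SP.PermissionKind.addListItems);
perms.set(SP.PermissionKind.editListItems);
perms.set(SP.PermissionKind.addAndCustomizePages); // the one missing from 'Contributor'
var roleDefInfo = new SP.RoleDefinitionCreationInformation();
roleDefInfo.set_name("Upload ASPX page");
roleDefInfo.set_description("Contribute-like, plus Add and Customize Pages");
roleDefInfo.set_basePermissions(perms);
web.get_roleDefinitions().add(roleDefInfo);
ctx.executeQueryAsync(
    function () { console.log("Permission level created"); },
    function (sender, args) { console.log("Error: " + args.get_message()); });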

Tuesday, April 17, 2018

Peculiarity with Active Authentication issues from VBA

Deriving code-snippets for how-to connect + authenticate from SharePoint-external automated clients to SharePoint Online, I ran into another peculiarity. This time not on the side of ADFS as STS, but in VBA as automation client. Translating the 'automated client' code from JavaScript into Visual Basic for Applications, I quickly had the scenario of Active Authentication with given username and password operational. But next I also wanted to have a working code-snippet for Integrated Active Authentication, based on the NTLM credentials of the logged-on interactive user. Only the step to determine the 'saml:Assertion' is here different compared to usernamemixed Active Authentication. However, this first step returned HTTP 401 instead of HTTP 200 with the derived 'saml:Assertion'. The request body is correct, as verified via RESTClient.
Logical thinking led to my suspicion that the NTLM credentials of the logged-on user are not transmitted from the Excel VBA context. Searching the internet for how-to include the current NTLM credentials in an HTTP request from VBA context, I found a tip (Windows authentication #15) to use "MSXML2.XMLHTTP" instead of "MSXML2.ServerXMLHTTP.6.0". Bingo, with this change in Request class, the Integrated Active Authentication scenario also works from VBA context (I already had it proven as working from a standalone HTML/JavaScript external client).
'Type declaration needed by the ole32 GUID functions declared below
Private Type GUID_TYPE
    Data1 As Long
    Data2 As Integer
    Data3 As Integer
    Data4(7) As Byte
End Type

Private Declare PtrSafe Function CoCreateGuid Lib "ole32.dll" (guid As GUID_TYPE) As LongPtr
Private Declare PtrSafe Function StringFromGUID2 Lib "ole32.dll" (guid As GUID_TYPE, ByVal lpStrGuid As LongPtr, ByVal cbMax As Long) As LongPtr

Private Function GetO365SPO_SAMLAssertionIntegrated() As String
    Dim CustomStsUrl As String, CustomStsSAMLRequest As String, stsMessage As String
    
    CustomStsUrl = "https://sts.<tenant>.com/adfs/services/trust/2005/windowstransport"
    CustomStsSAMLRequest = "<?xml version=""1.0"" encoding=""UTF-8""?><s:Envelope xmlns:s=""http://www.w3.org/2003/05/soap-envelope"" xmlns:a=""http://www.w3.org/2005/08/addressing"">" & _
            "<s:Heade>" & _
                "<a:Action s:mustUnderstand=""1"">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</a:Action>" & _
                "<a:MessageID>urn:uuid:[[messageID]]</a:MessageID>" & _
                "<a:ReplyTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous;</a:Address>;</a:ReplyTo>" & _
                "<a:To s:mustUnderstand=""1"">[[mustUnderstand]];</a:To>" & _
            "</s:Header>"
    CustomStsSAMLRequest = CustomStsSAMLRequest & _
            "<s:Body>" & _
                "<t:RequestSecurityToken xmlns:t=""http://schemas.xmlsoap.org/ws/2005/02/trust"">" & _
                    "<wsp:AppliesTo xmlns:wsp=""http://schemas.xmlsoap.org/ws/2004/09/policy"">" & _
                        "<wsa:EndpointReference xmlns:wsa=""http://www.w3.org/2005/08/addressing"">" & _
                        "<wsa:Address>urn:federation:MicrosoftOnline</wsa:Address>;</wsa:EndpointReference>" & _
                    "</wsp:AppliesTo>" & _
                    "<t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey;</t:KeyType>" & _
                    "<t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue;</t:RequestType>" & _
                "</t:RequestSecurityToken>" & _
            "</s:Body>" & _
        "</s:Envelope>"

    
    stsMessage = Replace(CustomStsSAMLRequest, "[[messageID]]", Mid(O365SPO_CreateGuidString(), 2, 36))
    stsMessage = Replace(stsMessage, "[[mustUnderstand]]", CustomStsUrl)

    ' Create HTTP Object ==> make sure to use "MSXML2.XMLHTTP" instead of "MSXML2.ServerXMLHTTP.6.0"; as the latter does not send the NTLM
    ' credentials as Authorization header.
    Dim Request As Object
    Set Request = CreateObject("MSXML2.XMLHTTP")
    
    ' Get SAML:assertion
    Request.Open "POST", CustomStsUrl, False
    Request.setRequestHeader "Content-Type", "application/soap+xml; charset=utf-8"
    Request.send (stsMessage)
    
    If Request.Status = 200 Then
         GetO365SPO_SAMLAssertionIntegrated = O365SPO_ExtractXmlNode(Request.responseText, "saml:Assertion", False)
    End If
    
End Function

Private Function O365SPO_ExtractXmlNode(xml As String, name As String, valueOnly As Boolean) As String
    Dim nodeValue As String
    nodeValue = Mid(xml, InStr(xml, "<" & name))
    If valueOnly Then
        nodeValue = Mid(nodeValue, InStr(nodeValue, ">") + 1)
        O365SPO_ExtractXmlNode = Left(nodeValue, InStr(nodeValue, "</" & name) - 1)
    Else
        O365SPO_ExtractXmlNode = Left(nodeValue, InStr(nodeValue, "lt;/" & name) + Len(name) + 2)
    End If
End Function

Private Function O365SPO_CreateGuidString()
    Dim guid As GUID_TYPE
    Dim strGuid As String
    Dim retValue As LongPtr
    Const guidLength As Long = 39 'registry GUID format with null terminator {xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}

    retValue = CoCreateGuid(guid)
    If retValue = 0 Then
        strGuid = String$(guidLength, vbNullChar)
        retValue = StringFromGUID2(guid, StrPtr(strGuid), guidLength)
        If retValue = guidLength Then
            ' valid GUID as a string
            O365SPO_CreateGuidString = strGuid
        End If
    End If
End Function

Sunday, April 15, 2018

Peculiarity with SharePoint Online Active Authentication

To invoke SharePoint Online REST services from an automated client that runs outside the SharePoint context itself, you have 2 options for authentication:
  1. Via OAuth 2.0; this requires administering a SharePoint Add-In as endpoint (see post Access SharePoint Online using Postman for an outline of this approach)
  2. Via SAML2.0; against the STS of your tenant
The steps for the SAML2.0 approach are excellently outlined in the post SharePoint Online Active Authentication; no need for me to repeat that here. However, a peculiarity I observed is that the handling is not only very picky on the correct messaging formats for respectively getting the 'SAML:assertion' from your STS [step 2], and next the 'wsse:BinarySecurityToken' [step 3]; it is also very picky on the exact url with which to request the SPOIDCRL token cookie [step 4]. I created a code snippet as a standalone JavaScript 'application', and although I followed all steps, I eventually ran into an HTTP 401 Unauthorized. While executing via the PowerShell from the above post, I did get the cookie returned; so it definitely works. Comparing the code very closely I identified the troublemaker: the call to <tenant>/_vti_bin/idcrl.svc must end with a trailing slash: <tenant>/_vti_bin/idcrl.svc/. Without that, the call returns 401; with the trailing slash, SharePoint Online Active Authentication also works successfully from a.o. an external JavaScript (e.g. SAPUI5) application context.
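A sketch of the step 4 request to illustrate the gotcha (in a browser context this additionally requires the CORS proxying described elsewhere on this blog):

// Sketch: step 4 of SharePoint Online Active Authentication - retrieve the SPOIDCRL cookie.
// Note the trailing slash on the idcrl.svc url; without it the request returns HTTP 401.
var url = "https://<tenant>.sharepoint.com/_vti_bin/idcrl.svc/"; // trailing slash required!
var xhr = new XMLHttpRequest();
xhr.open("GET", url);
xhr.setRequestHeader("Authorization", "BPOSIDCRL " + binarySecurityToken); // token from step 3
xhr.setRequestHeader("X-IDCRL_ACCEPTED", "t");
xhr.onload = function () {
    // the SPOIDCRL authentication cookie is returned via the Set-Cookie response header
};
xhr.send();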

Sunday, March 25, 2018

Optimize for bad-performing navigation in SharePoint Online

Good navigation capability + support is essential for the user adoption of any website. And thus also for (business) sites delivered through SharePoint. The default model for this is, and always has been, Structural Navigation, which is rather self-explaining for site owners to set up. However, for performance reasons, Structural Navigation is a bad practice in SharePoint Online when it concerns site collections with a deeper nested site hierarchy. In short, the cause of this is the dependency on the Object Cache, which has value in on-prem farms with a limited number of WFEs, but is useless in Office 365 where large numbers of WFEs are involved to serve the site collections. See a.o. So, why is Structural Navigation so slow on SharePoint Online? in Caching, You Ain’t No Friend Of Mine.
Microsoft itself recommends switching to search-driven navigation, and even has some 'working' code to set this alternative up in the context of a SharePoint Online site collection. Noteworthy is that the switch to search-driven navigation requires you to customize the masterpage; something which Microsoft otherwise warns us against. But in the classic experience there is no other option to replace the structural navigation by search-based navigation, and it is a weighed decision to risk modifying the masterpage. A risk which in my opinion is small, as I do not foresee that Microsoft will make big changes to the standard 'seattle.master', if any changes at all. The reason is that Microsoft is fully focussing on the modern experience, and little innovation is to be expected anymore in the classic experience. This is underlined by the fact that 'seattle.master' has been stable for years, without any change brought to it by Microsoft.
The 'working' code-snippet that Microsoft provides as part of their advice on how-to improve the performance of navigation is included in the Microsoft support article Navigation options for SharePoint Online. Although this is a good first resource, on deeper inspection the code has some flaws. Some of them are disclosed in the helpful post SharePoint Search Driven Navigation Start to Finish Configuration. On top of this, in my implementation I included some additional improvements in the ViewModel code (a sketch of improvement 4 follows after the list):
  1. Encapsulate all the code in its own module + namespace, to isolate and separate it from the anonymous global namespace
  2. On the fly load both jQuery and knockout.js libraries, if not yet loaded in the page context
  3. Made the script generic, so that it can directly be reused on multiple sites without the need for code duplication (spread) and site-specific code changes; this also enables distributing and loading the script code from an own Content Delivery Network (CDN)
  4. Cache per site, and per user; so that the client-cache can be used on the same device for both multiple sites, as well as by different logged-on accounts (no need to switch between browsers, e.g. for testing)
  5. Display the 'selected' state in the navigation, also nested up to the root navigation node
  6. Display the actual name of the rootweb of the sitecollection, instead of the phrase 'Root'
  7. Extend the navigation with navigation nodes that are additional to the site hierarchy; and include them also in the navigation cache to avoid the need to retrieve again from SharePoint list per each page visit
  8. Hide from the navigation any navigation nodes that are identified as 'Hidden' (same as possible with the standard structural navigation)
  9. Execute the asynchronous 'get' retrievals in parallel via 'Promise.all', to shorten the wait time, and also for cleaner structured code
  10. Extend with one additional level in the navigation menu (this is accompanied with required change in the masterpage snippet)
  11. Include a capability to explicitly refresh the navigation and bypass the browser cache, controlled via the querystring; convenient in particular during development + first validation
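As illustration of improvement 4, a minimal sketch of the per-site, per-user client caching (the names are mine, not from the actual ViewModel code):

// Sketch: cache the retrieved navigation nodes in localStorage, keyed per site and per
// user, so the same device can serve multiple sites and multiple logged-on accounts.
function getCachedNavigation() {
    var cacheKey = "navCache_" + _spPageContextInfo.siteServerRelativeUrl +
                   "_" + _spPageContextInfo.userId;
    if (location.search.indexOf("refreshNav=1") === -1) { // improvement 11: explicit bypass
        var cached = localStorage.getItem(cacheKey);
        if (cached) return JSON.parse(cached);
    }
    return null; // caller retrieves fresh via search REST, and stores it under cacheKey
}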
Also made some changes to the suggested snippet for the masterpage:
  1. Extend with one additional level in the navigation menu (see above, this is accompanied by required change in the ViewModel code)
  2. Preserve the standard 'PlaceHolderTopNavBar', as some layout pages (e.g. Site Settings, SharePoint Designer Settings, ...) expect it to be present, and throw an exception when it is missing from the masterpage
  3. Optional: Restore the 'NavigateUp' functionality; via standard control that is still included in the standard 'seattle.master' (a source for this: Restore Navigate Up on SharePoint 2013 [and beyond])

Sunday, March 18, 2018

Beware: set site to readonly impacts the permission set overviews

In our execution process of migrating collaboration sites from SharePoint 2010 to SharePoint Online, on completion we apply the following go-live steps:
  1. Set the lock status of source site to ‘readonly' (Lock or Unlock site collections)
  2. Enable a redirect from the root-url of the source site to the root-url of the target migrated site (This is a real success and much appreciated by our end-users: it is almost impossible for everyone to remember to update their own bookmarks to the now migrated sites. Definitely a best practice I recommend to everyone doing a migration to SharePoint Online!!)
  3. Send out communication to the site owner that his/her site is migrated. Migration issues that were not identified during the User Acceptance Test (UAT) of the migration will be handled as after-care
Immediately after the go-live of a migrated site, the after-care period starts. One of the issues that may arise is that a site visitor has lost authorizations compared to the old source site. We then compare the source vs the migrated site to investigate the complaint (note: in our custom IIS Redirect HttpModule we’ve incorporated the capability to bypass the automatic redirection). Something to be aware of when comparing the permission sets is that via our go-live actions we’ve also impacted the source site: setting the site to readonly results in a large list of “Deny” permissions being enlisted per checked user in the “check permissions” display.
Nothing to be surprised about (⇨ 'Deny' permissions are set via User Policy on webapplication level, and cannot be set on the level of individual site collections [1] [2]) or worried about, once you understand where this overwhelming list is actually coming from. You can safely ignore them in understanding what the actual “productive” permission set of the checked account was on the old source site.

Monday, March 5, 2018

Users need 'Use Client Integration Features' permission to launch OneDrive Sync from SharePoint Online library

Business users typically like very much the way-of-working via explorer with files stored in SharePoint libraries. Where we were in the past limited to doing that either via IE as browser with 'Open with Explorer', or via setting a Map Network Drive, nowadays the better alternative is to work via OneDrive Sync. Our business users underline that advice. However, some reported that they were unable to execute OneDrive Sync from a specific library, while a colleague could do this successfully. My initial suspicions were towards issues with the browser (this was actually the case with a related user issue, failure to 'Open with Explorer' - there the immediate cause was that the person's favorite browser is Chrome instead of IE, and Chrome does not support 'Open with Explorer' without additional extensions), automatic Office 365 login not working as it should (recently changed in our tenant from the old to the new sign-in experience, accompanied by a change from 'Keep Me Signed In' to 'Stay signed in'), inconsistency in the components of the locally installed Office 365 suite, a missing OneDrive license, ...
However, upon following the investigation route that it does work for the colleague, I compared their permissions. To observe that the business users for whom a click on Sync does not launch the OneDrive Sync client miss the specific permission 'Use Client Integration Features' in their SharePoint authorizations. This is a required permission for a.o. the OneDrive Sync launch. I included the permission in their applied Permission Level, and the problem was resolved.
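A quick way to verify this per user is a sketch like the following, executed in the context of the affected user:

// Sketch: check whether the current user holds the 'Use Client Integration Features'
// permission on the web, via the JSOM effective base permissions.
var ctx = SP.ClientContext.get_current();
var web = ctx.get_web();
ctx.load(web, "EffectiveBasePermissions");
ctx.executeQueryAsync(function () {
    var hasIt = web.get_effectiveBasePermissions()
                   .has(SP.PermissionKind.useClientIntegration);
    console.log("Use Client Integration Features: " + hasIt);
}, function (sender, args) { console.log("Error: " + args.get_message()); });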

Wednesday, February 21, 2018

Migrated SharePoint 2010 workflow cannot be opened Online in SharePoint Designer 2013

On migration of a SharePoint 2010 site to SharePoint Online, it is supported that SharePoint Designer workflows are migrated as well. They are classified in their new destination as 'SharePoint 2010 workflows'. There is a difference though qua maintenance tooling: one has to switch from using SharePoint Designer 2010 to the SharePoint Designer 2013 edition. Such a switch may result in an issue: "SharePoint Designer cannot display the item".
The root cause of this lies within the cache-handling of SharePoint Designer. Both the 2010 and 2013 versions use the same local location on your workstation to store cached files. However, the structure within the cached files is not forwards-compatible from SharePoint Designer 2010 to 2013. To resolve the error, one can apply one of the following approaches:
  1. Disable usage of the 'cache' capability in SharePoint Designer 2013: it will then no longer try to load + reuse the cached files that were initially created on your workstation by opening the workflow via SharePoint Designer 2010
  2. Clean up the local cache to remove the SharePoint 2010 versions of the cached workflow files: delete all cached files from the local locations below, e.g. via the sketch after this list (Resource: SharePoint Designer cannot display the item (SharePoint 2013))
    • C:\Users\<UserName>\AppData\Roaming\Microsoft\SharePoint Designer\ProxyAssemblyCache
    • C:\Users\<UserName>\AppData\Roaming\Microsoft\Web Server Extensions\Cache
    • C:\Users\<UserName>\AppData\Local\Microsoft\WebsiteCache
  3. (Get yourself a new / other laptop:) Open the workflow in SharePoint Designer on another workstation, on which the workflow was not previously managed via SharePoint Designer 2010 when still on SharePoint 2010
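For approach 2, a minimal sketch to clear the three cache locations named above (close SharePoint Designer before running it):

# Minimal sketch: clear the SharePoint Designer cache locations listed above.
# Make sure SharePoint Designer is closed before running this.
$cachePaths = @(
    "$env:APPDATA\Microsoft\SharePoint Designer\ProxyAssemblyCache",
    "$env:APPDATA\Microsoft\Web Server Extensions\Cache",
    "$env:LOCALAPPDATA\Microsoft\WebsiteCache"
)
foreach ($cachePath in $cachePaths) {
    if (Test-Path $cachePath) {
        Remove-Item -Path (Join-Path $cachePath "*") -Recurse -Force
    }
}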

Monday, February 12, 2018

PowerShell to assess the external access authorization per site

As clarified in the previous post, the Azure AD Access Reviews capability, although promising in concept, is in its current implementation yet unfit to assess the external access per site. But luckily we have PowerShell, which enables us to determine per site the collection of guest authorizations, and ask the site owner to review + re-confirm those authorizations. Crucial is to provide insight and awareness: who has access authorization to my business site, and am I as site / business owner still OK with each individual guest authorization? For those that are not / no longer: explicitly revoke, for good secure housekeeping in your externally shared site.
PowerShell script to assess the external authorization per site in the tenant:
<#
.SYNOPSIS

Access Review of guest users into the SharePoint tenant
#>

# Connect to SharePoint Online (requires the SharePoint Online Management Shell and a tenant admin account)
$SPOAdminSiteUrl = "https://<tenant>-admin.sharepoint.com/"
try {
    Connect-SPOService -Url $SPOAdminSiteUrl -ErrorAction Stop
} catch {
    Write-Error "Could not connect to $SPOAdminSiteUrl : $_"
    exit
}

# Dictionary: per external user (key = email address), the external shared site URLs the user has access to
$externalUsersInfoDictionary = @{}

# All sites in the tenant on which sharing with existing external users is enabled
$externalSharedSites = Get-SPOSite | Where-Object {$_.SharingCapability -eq "ExistingExternalUserSharingOnly"}
foreach ($site in $externalSharedSites)
{
    $externalUsersInfoCollection= @()

    # Page through the site's external users; the loop continues as long as the
    # previous page came back completely filled, i.e. $position keeps pace with $page * $pageSize
    $position = 0
    $page = 0
    $pageSize = 50
    while ($position -eq $page * $pageSize) {
        foreach ($externalUser in Get-SPOExternalUser -Position ($page * $pageSize) -PageSize $pageSize -SiteUrl $site.Url | Select DisplayName,Email,WhenCreated) {
            if (!$externalUsersInfoDictionary.ContainsKey($externalUser.Email)) {
                $externalUsersInfoDictionary[$externalUser.Email] = @()
            }
            $externalUsersInfoDictionary[$externalUser.Email]+=$site.Url       
 
            # Collect this guest's details for the per-site review export
            $externalUsersInfo = New-Object psobject
            $externalUsersInfo | add-member noteproperty -name "Site Url" -value $site.Url
            $externalUsersInfo | add-member noteproperty -name "Email" -value $externalUser.Email
            $externalUsersInfo | add-member noteproperty -name "DisplayName" -value $externalUser.DisplayName
            $externalUsersInfo | add-member noteproperty -name "WhenCreated" -value $externalUser.WhenCreated
            $externalUsersInfo | add-member noteproperty -name "Preserve Access?" -value "Yes"
           
            $externalUsersInfoCollection+=$externalUsersInfo

            $position++
        }
        $page++
    }

    # Export the per-site review file, but only for sites that actually have external users
    if ($externalUsersInfoCollection.Count -ne 0) {
        $exportFile = "External Access Review (" + $site.Url.SubString($site.Url.LastIndexOf("/") + 1) + ") - " + $(Get-Date -f yyyy-MM-dd) + ".csv"
        $externalUsersInfoCollection |  Export-Csv $exportFile -NoTypeInformation
    }
}

# Export matrix overview: per user, which of the external shared sites the user has been granted access to
$externalUsersInfoCollection= @()

$externalUsersInfoDictionary.Keys | ForEach-Object {
    $externalUsersInfo = new-object psobject
    $externalUsersInfo | add-member noteproperty -name "User Email" -value $_

    foreach ($site in $externalSharedSites) {
        if ($externalUsersInfoDictionary[$_].Contains($site.Url)) {
            $externalUsersInfo | add-member noteproperty -name $site.Url -value "X"           
        } else {
            $externalUsersInfo | add-member noteproperty -name $site.Url -value ""             
        }
    }

    $externalUsersInfoCollection+=$externalUsersInfo    
}

$exportFile = "External Access Review user X site - " +  $(get-date -f yyyy-MM-dd) + ".csv"
$externalUsersInfoCollection |  Export-Csv $exportFile -NoTypeInformation

Disconnect-SPOService

Friday, February 9, 2018

Azure AD Access Review yet useless for SharePoint External Sharing

In order to remain compliant with company-internal information security policies, it is essential to regularly assess the authorizations of external guests on externally shared SharePoint Online sites. At Ignite 2017 Microsoft announced the Azure AD capability Access Reviews. Initially I was rather enthusiastic about the concept of 'Manage guest access with Azure AD access reviews', but after some evaluation my personal conclusion is that in its current implementation stage it is pretty useless for assessing SharePoint external access.
In the current setup you can select between 2 modes to assess:
  1. Assess on Azure AD Group Membership
  2. Assess on access to an Office 365 application
However, both are useless for assessing the access to one or more specific SharePoint Online sites. In Azure AD B2B based external sharing, externals are invited to a SharePoint site via their Azure AD guest account. In this model, the guest's access is neither via a specific Azure AD Group, nor is the guest on Azure AD level specifically authorized to SharePoint Online as an application. Their authorization to SharePoint as application is implicit, resulting from their invitation to one or more specific sites.
I played a bit with the 'access review' capability (note: the documentation on it is very scarce and incomplete):
  • In the review mode on 'O365 SharePoint Online as application', I get no results at all.

  • In the review mode on 'Group Membership' I selected the dynamic group that includes all guest accounts. With this review mode I do get results to review their access, but the value is limited to gaining insight in the last logon per guest account. You can then as reviewer make a decision to Approve or Deny the continued group membership. But in reality this review decision cannot be effectuated: the group membership is dynamic, based on a condition; not on concrete addition to the group.

    [Screenshots: Access Review on (dynamic) Azure AD Group membership; applied Access Review decision on (dynamic) Azure AD Group membership]

My thoughts shared with product team + community

I reported my 'negative' evaluation as feedback to a contact in the Azure AD product group: "I question how it would be applied: removing the 'refused' accounts from the Dynamic Group does not make sense; they should be blocked or removed from Azure AD to block access. Also, as a site owner only wants to take responsibility for access to his/her site, the access decision application should be applied there. Not on tenant level."

His response: "I think you have some interesting use cases. As the product is still in preview, documentation is limited. I will discuss your use cases with my colleagues in Redmond responsible for Access Reviews."

In addition, I also submitted a SharePoint UserVoice idea: Azure AD access review on level of single (shared) site collection

A nice post on the topic, including a 'manual': Checking Office 365 Group Membership with Azure AD Access Reviews

Saturday, February 3, 2018

(Risky) Approach to invoke SharePoint Online API on whatever site from external automation client

For an external automation client to access SharePoint Online via the API, the same holds as for you as a human visitor: it first needs to authenticate itself against the accessed SharePoint Online tenant, and next it must have the needed authorization on the specific accessed SharePoint content entity (site collection, site or list). This can be achieved by registering an Add-In and assigning it the needed permissions. That Add-In can then be used as authentication + authorization endpoint for the external client. The post Access SharePoint Online using Postman includes a good reference of the steps that you should automate in the external client. Drawback of the described approach is that it is scoped to an individual site collection or even a single site. The Add-In registration is not utilizable as a generic gateway to all sites in the SharePoint Online tenant.
To extend the scope to the full tenant, the Add-In Only permissions must be requested on 'tenant' level. This can only be done by a tenant administrator, and must be done via the tenant-admin site (How to provide add-in app only tenant administrative permissions in SharePoint Online). But be very careful and deliberate before taking this approach: the implication is that whoever knows the client id and secret of the registered Add-In is now enabled to access, via an external client, data from whatever site in your tenant. This makes site-specific permission management pretty useless, even laughable. But your information security officers will certainly not consider it a good + acceptable joke...
With Add-In Only permission on 'tenant' level, an external client that knows the Add-In's client id + secret can request an access token, and then use that same single token to access whatever site in the SharePoint tenant:
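A minimal sketch of that flow in PowerShell (tenant name, tenant realm id, client id and secret are placeholders to substitute; the ACS token endpoint and the fixed SharePoint principal id 00000003-0000-0ff1-ce00-000000000000 are the documented values for low-trust Add-In authentication):

# Placeholders: substitute with your own tenant + Add-In registration values
$tenantName   = "<tenant>"
$tenantId     = "<tenant realm guid>"
$clientId     = "<add-in client id>"
$clientSecret = "<add-in client secret>"

# Fixed principal id of SharePoint Online
$spPrincipal = "00000003-0000-0ff1-ce00-000000000000"

# Step 1: request an app-only access token from Azure ACS
$body = @{
    grant_type    = "client_credentials"
    client_id     = "$clientId@$tenantId"
    client_secret = $clientSecret
    resource      = "$spPrincipal/$tenantName.sharepoint.com@$tenantId"
}
$tokenResponse = Invoke-RestMethod -Method Post -Uri "https://accounts.accesscontrol.windows.net/$tenantId/tokens/OAuth/2" -Body $body
$accessToken = $tokenResponse.access_token

# Step 2: use that same single token on the REST API of whatever site in the tenant
$web = Invoke-RestMethod -Method Get -Uri "https://$tenantName.sharepoint.com/sites/<whatever site>/_api/web" -Headers @{ Authorization = "Bearer $accessToken"; Accept = "application/json;odata=verbose" }
$web.d.Title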