Saturday, December 27, 2014

Conceptual approach for exposing external SAP systems into SharePoint channel

For the last 6 years I have focused on utilizing SharePoint as an additional channel to expose SAP data and operate SAP functionality. As I am now about to start another challenge in the broad SharePoint arena, I decided to write down my advice on how to achieve such SAP/SharePoint integration: what approach to take, architectural advice, and what steps to follow. I intentionally do not elaborate on technology products, as these by nature are only temporary; instead I focus my outline on concepts.

Rough outline of approach

  1. Establish the reference solution architecture for integrating SAP business process handling into SharePoint front-end / UI
  2. Establish which of the business processes to expose via SharePoint as an additional channel, versus which to keep solely within the SAP UI channel.
  3. Establish the IST (as-is) situation: business process definitions and supportive landscape, enterprise IT strategy + roadmap, IT landscape.
  4. Establish the SOLL (to-be) situation.
  5. Determine the IST-SOLL gap.
  6. Map the solution architecture + IST-SOLL analysis onto concrete technology: Microsoft and SAP.
  7. Architect, design, develop, test and implement specific scenario(s).

Architecture guidelines

  • Apply a layered architecture
  • Loosely couple the front-end and back-office system(s) landscape through the usage of interfaces (contracts)
  • Base on interoperability standards: web services, security standards, monitoring
  • Abstract the LOB system business process handling as a contract specification
  • Extend the decoupled operation in the alternative front-end with the concept of an Operational Data Store, to enable temporary administration outside the external LOB system and to allow complete preparation of an administrative action before submitting it into the external Line-of-Business system.

Main steps for a specific scenario

Per specific scenario / use case to expose an external LOB system into SharePoint:
  • Identify the data and functionality of the external LOB system that you want to expose via SharePoint as (additional) channel for user operation.
  • Identify, within the platform context of the external LOB system, the building blocks (functional and technical) that can be used to operate the identified data and functions. The external LOB system and its internal workings are a (magical) black box for the uninitiated. You need the aid of a knowledgeable business analyst to help identify the proper functional building blocks, and how to utilize and invoke them.
  • Service-enable the identified building blocks. Don’t be fooled by the concept of SAP Enterprise Services: in practice these are only usable in a (SAP) laboratory scope, not in the concrete situation of an end organization. In reality you need to provide your own service layer to expose the identified building blocks. SAP has acknowledged this, and provides multiple supporting foundations and building blocks. In my opinion, currently the most significant is SAP Gateway, which exposes SAP data and processes as REST services (see the sketch below this list). For consumption in Microsoft clients, SAP delivers Duet Enterprise and Gateway for Microsoft, which extend the basic interoperability capabilities of SAP Gateway. In addition to SAP, frameworks of other suppliers are also available on the market to service-enable SAP; a disadvantage is that they may complicate your IT landscape, while the trend of recent years in end organizations is to consolidate on a SAP + Microsoft IT policy.
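To make the REST consumption tangible: a minimal sketch, in C#, of a .NET consumer invoking a service-enabled building block exposed by SAP Gateway as an OData REST service. The service URL, entity set and credentials are illustrative assumptions, not an actual Gateway endpoint.

using System;
using System.IO;
using System.Net;

class GatewayRestClient
{
    static void Main()
    {
        // Illustrative Gateway OData service URL; the actual URL depends on the
        // service-enabled building block in your own landscape.
        string url = "https://gatewayhost/sap/opu/odata/sap/ZDEMO_SRV/BusinessPartners?$format=json";

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.Accept = "application/json";
        // Basic authentication for brevity; a productive setup should apply
        // SSO standards (SAML2, OAuth2, X.509) instead.
        request.Credentials = new NetworkCredential("user", "password");

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            // The OData response is plain JSON text; hand it to any JSON parser.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}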

Saturday, November 15, 2014

Rolling out Nov 2014 CU for SP2013 broke Enterprise Search

Against better judgement, I tried this week to install the November 2014 Cumulative Update for SharePoint 2013 almost immediately after it was announced. The installation on our single-server farm failed, I suspect due to insufficient disk space (one more motivation to consider moving our demo landscape to the Azure cloud…). Besides the November CU thus not being installed, the more problematic consequence was that parts of our SharePoint landscape appeared broken. The installation made some changes during its faulted execution that were apparently not all rolled back:
An unhandled exception of type 'System.ServiceModel.Security.MessageSecurityException' occurred in Microsoft.Office.Server.Search.dll
Additional information: The HTTP request was forbidden with client authentication scheme 'Anonymous'.
This error occurred on trying to construct a KeywordQuery instance. Having been in the SharePoint business for a longer time, I decided not to spend too much time trying to locate the problem cause, and just started with the simple first approach: reboot the server. And how surprising: this worked, and our SharePoint 2013 farm is again working correctly. Yet without the Nov 2014 CU, for which in the meantime Microsoft has already issued a corrective fix. But now I’ll wait with rolling that out until it is proven in the field, and/or we actually need it in our SharePoint 2013 landscape.

Wednesday, November 5, 2014

Recover from SharePoint 2013 Search and CryptoGraphic mismatch - 'Key does not exist'

The SharePoint Search Service Application (SSA) is used to crawl external data via Duet Enterprise 2.0. In the Duet Enterprise authentication flow from SharePoint to SAP, the SSA process invokes, via Business Connectivity Services, the Secure Token Service to create at runtime an X.509 authentication user certificate for the SharePoint account under which the search crawl is executed. (See 'How authentication works in Duet Enterprise 2.0'.)
This worked fine, until I upgraded the SharePoint 2013 landscape to the latest released cumulative updates: Sept + Oct 2014. Part of the upgrade steps is to temporarily disable the SharePoint Search services, and restart them after the CU installations. Afterwards, however, it appeared that the runtime Duet Enterprise SSO behavior was broken. The crawl log of the external content source consistently reported the error Exception in invoking the ODataExtensionProvider of type 'OBA.Server.Canary.ObaOdataServerExtensionProvider'. And the ULS log constantly contains the error 'The search connector framework caught an exception from BDC: Exception in invoking the ODataExtensionProvider of type 'OBA.Server.Canary.ObaOdataServerExtensionProvider'. (Key does not exist.)'
But this occurs only when the BCS OData service is invoked from the Search crawling context. Using the same SharePoint user credentials to interactively retrieve SAP data in a SharePoint site still works, and successfully retrieves the external SAP data applying Duet Enterprise 2.0 Single Sign-On.
Recovery fix:
Restart the SharePoint Search service (OSearch15), to force a reset of the runtime memory in that process and a resynchronization with the cryptographic store at Windows OS level.

Monday, November 3, 2014

On-the-fly add client-side filtering and sorting to GridView

The ASP.NET GridView is a powerful UI control to visualize an overview of (business) data entities. Standard / COTS products (e.g. for SharePoint) also make heavy use of the GridView control and its capabilities.
When using a COTS product, it can be challenging to address requests from end users with respect to the grid behavior. You do not have the option to change the server-side behavior yourself, but are dependent on the supplier. Ok, so server-side is off the table; but these days we prefer client-side anyhow, as it gives faster interactive UI responsiveness and thus a better user experience.
In our scenario, the customer requested to add filtering and sorting UI behavior to an overview grid. As the output of GridView in the browser is simply an HTML table element, I searched the internet for javascript libraries to sort and filter tables. There are several available; I picked the combination of List.js and jQuery.TableSorter.js.
The next step is to activate their client-side behavior in the rendered GridView output. This requires some straightforward jQuery coding: select the table element, enrich it with the new child elements that are needed by the sorting and filtering libraries, import the javascript libraries at runtime, and activate the sorting respectively filtering client-side behavior.
/* Add client-side filtering and sorting to the requests overview. */
$.getScript("/.../scripts/list.js", function(data, textStatus, jqxhr) {
    $("#RequestsOverview tbody").addClass("list");
    var filters =
        '<tr> \
            <td><div id="filterPH1"> </div></td> \
            <td><div id="filterPH2"> </div></td> \
            <td><input type="text" id="date" class="search" placeholder="Filter" /></td> \
            <td><input type="text" id="description" class="search" placeholder="Filter" /></td> \
            <td><input type="text" id="state" class="search" placeholder="Filter" /></td> \
        </tr>';
    $("#RequestsOverview thead tr").after(filters);
    $("#RequestsOverview").parent().attr('id', 'RequestsOverviewDiv');
    // Tag the cells with the class names that List.js uses as valueNames.
    $("#RequestsOverview .wf-id-event").each(function(i) {
        $(this).parent().children().each(function(i) {
            switch (i) {
                case 2: $(this).addClass("date"); break;
                case 3: $(this).addClass("description"); break;
                case 4: $(this).addClass("state"); break;
            }
        });
    });
    var userList = new List('RequestsOverviewDiv', { valueNames: ['date', 'description', 'state'] });
});
$.getScript("/.../scripts/jquery.tablesorter.js", function(data, textStatus, jqxhr) {
    $("#RequestsOverview").tablesorter({
        cssHeader: "headerSort",
        headers: { 0: { sorter: false }, 1: { sorter: false } },
        dateFormat: 'pt'
    });
});

Result:

Without client-side plug-in of sorting and filtering:
Grid UI behavior enriched with sorting and filtering:
Limitation:
As the plugged-in sorting and filtering works on the data available client-side, it cannot be used in case of server-side GridView paging. For that it is required to also sort and filter server-side, or to replace the server-side paging with a client-side approach. The latter is typically not desirable, as it requires that all the data is immediately sent to the client, while the user may be interested in only the first page.

Friday, October 31, 2014

Expose SAP data via Office 365 API to iOS and Android?

At Microsoft TechEd Europe, an important announcement is the new Office 365 APIs for iOS and Android (native) apps. Through the Office 365 APIs, iOS and Android developers are enabled to directly consume and utilize Office 365 entities in native mobile apps. Initially this encompasses Office 365 mail, calendar, contacts and files/documents. Later on the roadmap, tasks, Yammer (social) and Office Graph will also be made available via the Office 365 APIs.
In my previous posting I reported on the latest version of SAP Gateway for Microsoft ('GWM Azure'), which enables access to SAP data in the Office 365 context. With the announcement of the new Office 365 APIs, this gives an interesting prospect. Will it also be possible to disclose SAP data through the combination of GWM + the Office 365 APIs, for usage in native iOS and Android apps? If so, some strong use cases for a new type of composite business apps become possible, in which Office 365 (personal) productivity data is combined with the business data in SAP. Directly available on the nowadays-preferred business channel: tablets and smartphones. It will be interesting to closely watch the actions of SAP and Microsoft on this…

Thursday, October 30, 2014

Expose SAP data into Office 365 productivity clients

On 16 September, SAP launched an update of their product SAP Gateway for Microsoft (GWM). Code-named "GWM Azure", this update focuses on integration of SAP backend data and functionality, through SAP NetWeaver Gateway, into the Microsoft Azure cloud and Office 365 tenants.
As a member of the Customer Engagement Initiative group on SAP-Microsoft interoperability, I was fortunate to participate in the product’s customer validation preceding the product launch. In our customer validation (CuV), I focused on ‘enterprise-ready SSO’. With this phrase, I mean a robust, controlled and foremost enterprise-scalable way to give employees, via the Office 365 context, access to the on-premise SAP data and functionality. In practice this translates for me into relying on authentication standards like SAML2, OAuth2 and X.509 certificates, and not making use of (weak) username/password authentication. Mind you, the latter is fine for initially playing around and executing PoCs. But it is not a secure and maintainable approach when addressing productive scenarios with larger user groups.
The outcome of our CuV participation turned out very well. I could prove in ample time that the SAML2-based Single Sign-On from our Office 365 tenant via GWM Azure into our on-premise SAP landscape (Gateway + business suites), well... simply works!! The access to the SAP data in the Office 365 clients is still authorized based on the SAP authorization permissions and roles. An Office 365 user is only granted access to SAP data and functionality in compliance with his/her role in the SAP business systems.
With the release of GWM SP3 (GWM Azure), the availability of an organization’s SAP business data can easily, but still securely, be extended to the Office 365 productivity clients. With Microsoft putting strong emphasis on the Office 365 proposition, and lots of organizations actually buying into this (including new customers for Microsoft, as a consequence of Microsoft aggressively targeting the small and midsize business (SMB) market), this adds a powerful new business proposition; for the Office 365 ecosystem but also for individual Office 365 subscribers.

Wednesday, September 17, 2014

Tip: HowTo mitigate impact of IE8/IE9 style-tag limit

Many organizations still have IE8 or IE9 as the standard browser installed on their employee workstations. Internet Explorer versions before IE10 have a hard boundary on the maximum number of stylesheet link imports that are applied, namely 31 (see Microsoft Support: 'A webpage that uses CSS styles does not render correctly in Internet Explorer'). Imported link entries beyond that maximum are effectively ignored in the IE8/IE9 output rendering. This results in missing a part of the applied CSS styling, and different rendering in IE8/IE9 compared to other browsers (Firefox, Chrome, Safari, IE10/11). The limit of 31 seems large enough; or stated differently: that you would need more than 31 seems ridiculous and unmanageable. But be aware that in the concept of SharePoint, content editors compose functional experiences by placing multiple, mutually independent webparts on SharePoint content pages. These webparts - common off-the-shelf (Microsoft, any from the large SharePoint community) or custom-built - are self-contained, also for their CSS styling. And the standard SharePoint:CssLink already adds (consumes) 4 CSS link imports in the generated page HTML. So when multiple webparts are added to a page, the limit of 31 link entries in the total page HTML can certainly be exceeded.
The IE8/IE9 link limit is a hard number; it is not possible to increase it. If it is not yet possible or not planned for an organization to migrate to a later IE version (or another browser), you must apply an approach to mitigate the impact of the link limit. For a part this can be achieved by collapsing multiple link references within one 'style' island in the outputted HTML. IE8/IE9 regards and applies that as one single style element, and this way you can have multiple link imports that together consume only 1 of the 31 available stylesheet link slots.
Example:
Replace:
<SharePoint:CssRegistration Name="<%$SPUrl:~SiteCollection/Style Library/css/cssfile1.css%>" runat="server"/>
<SharePoint:CssRegistration Name="<%$SPUrl:~SiteCollection/Style Library/css/cssfile2.css%>" runat="server"/>
...
Into:
<style type="text/css">
<!--
    @import url("../../cssfile1.css");
    @import url("../../cssfile2.css");
    ...
-->
</style>
This approach is not a silver bullet to 100% circumvent the hard-imposed IE8/IE9 style-tag limit, but it might save you just enough occupied stylesheet link slots to leave sufficient slots for the COTS webparts placed on a page. This was the case for myself in a customer project with a COTS SharePoint-based product implementation.
Note:
It is also good to realize that the IE8/IE9 emulation mode in IE10 and later does not impose the link limit. On this aspect, IE8/IE9 emulation therefore behaves differently than real IE8/IE9 browsers.

Friday, September 5, 2014

PowerShell to repeatedly provision SharePoint master and config data

The ease of creating and maintaining SharePoint lists also makes them a good option for application configuration. As an example, in one application I applied a SharePoint list as the configuration storage for task type specifications, in another as the configuration storage of the specifications for BCS invocations via a generic BDC model. However, as easy as it is to create a list, it is just as easy to delete one, including the configuration contents. In particular in the development phase, with repeated solution deployment and feature de- and reactivation, the configuration list and thus its contents typically get deleted.
To cope with this, I utilize PowerShell to refill the list with the desired configurations. And the same PowerShell scripts are also of use when transitioning the application from the development landscape to QA, and ultimately to production.
Example:
# Script: clear + (re)fill the Application Configuration list

$webApp = "<url>"
$ApplicationConfigList = "/Lists/ApplicationConfiguration"

# Functions

function X-CheckItemNotExists($list, $itemTitle)
{
   $camlQuery =
      "<Where>
         <Eq>
            <FieldRef Name='Title' />
            <Value Type='Text'>" + $itemTitle + "</Value>
         </Eq>
       </Where>"
   $spQuery = New-Object Microsoft.SharePoint.SPQuery
   $spQuery.Query = $camlQuery
   $spQuery.RowLimit = 1
   $destItemCollection = $list.GetItems($spQuery)
   return $destItemCollection.Count -eq 0
}

# Initialization

$spWeb = Get-SPWeb $webApp

# (Re)fill the Application Config Settings

$ApplicationConfigSettings = @{
   "0" = @("<config-item title>", "AovpGetPerson", "{0,8:D8}{1,30: 30}", "<SAP-system-ID>");
   "1" = @("<config-item title>", "CrmGetCustomer", "{0,8:D8}{1,8:D8}", "<Oracle-system-ID>")
}

$spApplicationConfigList = $spWeb.GetList($ApplicationConfigList)
foreach ($array in $ApplicationConfigSettings.Values)
{
   if (X-CheckItemNotExists -list $spApplicationConfigList -itemTitle $array[0].ToString())
   {
      $spApplicationConfigItem = $spApplicationConfigList.AddItem()
      $spApplicationConfigItem["JsonMethodName"] = $array[0].ToString()
      $spApplicationConfigItem["GenericModuleMethodName"] = $array[1].ToString()
      $spApplicationConfigItem["CallInputParamsFormat"] = $array[2].ToString()
      $spApplicationConfigItem["ExternalSystemAlias"] = $array[3].ToString()
      $spApplicationConfigItem.Update()
   }
}

# Release

$spWeb.Dispose()

Tuesday, August 26, 2014

1st lessons learned from applying Content Enrichment web service callout

In my previous post I reported how I successfully applied the new capability within SharePoint 2013 Enterprise Search to enrich the crawled content. That is immediately my first lesson: the SharePoint 2013 Content Enrichment web service callout is far easier to utilize than the Item Processing capability of FAST Search Server for SharePoint 2010. You can set up a working scenario in a matter of hours, while FAST Item Processing typically costs you days; all the more as the latter is poorly documented.
However, I also do have some lessons learned to keep in mind when utilizing Content Enrichment Service callout:
  • Be aware that if you leave out a trigger in the ContentEnrichmentConfiguration object, the callout will occur for each crawled data element. In a production scenario this can seriously increase the total crawling time.
  • Be aware that the Content Processing pipeline invokes the Content Enrichment web service via anonymous service calls. This means it is not viable to deploy it as a web service in a SharePoint webapplication, as that typically is an authenticated web resource. Best is to deploy the Content Enrichment web service in its own anonymous IIS website.
  • You can only configure one single ContentEnrichmentConfiguration object per SharePoint Search Application (SSA). The consequence is that you can also only use one single Content Enrichment web service to enrich the crawled content in this SSA. This is a serious limitation!! I found a post with a creative solution to work around the single-endpoint limitation: Using Multiple Endpoints as a Content Enrichment Web Service in SharePoint 2013 Search. But a real solid approach has to come from Microsoft itself, by removing the limitation of only 1 ContentEnrichmentConfiguration per SSA.

Sunday, August 24, 2014

Content Enrichment callout to properly map crawled data format

In combination with SharePoint BCS, SharePoint Enterprise Search can crawl external business systems as data sources. Starting with SharePoint 2013, BCS can now also consume REST/OData services. In an OData service response, the type of all data fields is by nature text. The values of SharePoint Search crawled properties are likewise all raw text. In the Enterprise Search content processing pipeline this text-based value can be mapped to a managed property of a specific datatype, e.g. text, integer, decimal, date and time. In order to successfully and meaningfully map from the text data format to a specific datatype, the text format must be parsable via the current localization / culture into that specific datatype. If not, the mapping will fail and the value of the managed property will be null.
When crawling an external data source, you typically do not have control over the data format. In case the data format does not match the localization format, the mapping thus fails.
Example: OData returns date information in format ‘YYYYMMDD’; the default localization datetime formats do not support this, and the mapped managed property of type Date and Time contains a null value:
In such a situation, you can utilize the SharePoint 2013 content enrichment capability to explicitly parse the crawled non-localized data format.
The approach is as follows:
  1. Remove the mapping from the Managed Property of type Date and Time
  2. Create a new Managed Property of type text, and map this to the crawled property
  3. Create an implementation of IContentProcessingEnrichmentService, with the ProcessItem method set up to parse ‘YYYYMMDD’ into a Property<DateTime> (see the sketch after this list)
  4. Configure SharePoint Search Application to map the new managed property (input) to the datetime managed property (output)
  5. Issue a full crawl on the business data content source.
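A minimal sketch of such an IContentProcessingEnrichmentService implementation (step 3). The managed property names 'SAPDateText' (input) and 'SAPDate' (output) are illustrative assumptions; the contract usage follows the standard SharePoint 2013 content enrichment sample structure.

using System;
using System.Collections.Generic;
using System.Globalization;
using Microsoft.Office.Server.Search.ContentProcessingEnrichment;
using Microsoft.Office.Server.Search.ContentProcessingEnrichment.PropertyTypes;

public class DateParsingEnrichmentService : IContentProcessingEnrichmentService
{
    public ProcessedItem ProcessItem(Item item)
    {
        ProcessedItem processedItem = new ProcessedItem
        {
            ItemProperties = new List<AbstractProperty>()
        };

        foreach (AbstractProperty property in item.ItemProperties)
        {
            // 'SAPDateText' is the assumed text managed property holding 'YYYYMMDD'.
            Property<string> textProperty = property as Property<string>;
            if (textProperty != null && property.Name == "SAPDateText")
            {
                DateTime parsed;
                if (DateTime.TryParseExact(textProperty.Value, "yyyyMMdd",
                    CultureInfo.InvariantCulture, DateTimeStyles.None, out parsed))
                {
                    // Emit the parsed value into the Date and Time managed
                    // property (assumed name 'SAPDate', configured as output).
                    processedItem.ItemProperties.Add(
                        new Property<DateTime> { Name = "SAPDate", Value = parsed });
                }
            }
        }
        return processedItem;
    }
}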
The result in SharePoint index:

Friday, August 15, 2014

Tip for ‘Secure Store Service application is not accessible’

This week I was forced to restart our SharePoint 2013 single-server system. After the server had restarted, Duet Enterprise 2.0 Single Sign-On gave an error: The Secure Store Service application Secure Store Service is not accessible. The full exception text is: An error occurred while making the HTTP request... Opening the Secure Store Service application in Central Administration displayed the same error. I checked IIS Manager: the application pools of the SharePoint Web Services were all running. I retried recycling them, but the problem remained. Even after a brute IISReset.
I was already preparing myself to inspect the full identity trail of the app pool account to detect whether it was compromised, and potentially even to repair the Secure Store Service application. But in a bright moment, I thought of inspecting ‘Manage Services on Server’. Secure Store Service showed in the overview as ‘Started’. As a quick trial-and-error attempt, I stopped and then (re)started the Secure Store Service. And guess what: this resolved the issue! An easy infra fix after all, when you think of it.

Monday, July 21, 2014

Connect SharePoint 2013 to SAP via Gateway for Microsoft

SAP initially developed Gateway for Microsoft [GWM] as ‘Duet Enterprise beyond SharePoint’: allow Gateway OData service consumption in other Microsoft front-end formats, like the Microsoft Office clients (Outlook, Excel, Word, PowerPoint). In essence GWM can also be regarded as a trimmed-down Duet Enterprise 2.0 variant; it provides some of the same basic integration capabilities, but not all. The question that arises: can you also utilize GWM to connect SharePoint to SAP?
The answer is a firm Yes you can! Which is logical, as SharePoint itself is a .NET application. You can apply multiple approaches to consume the SAP data through a GWM-generated reference proxy into SharePoint:
  1. Connect to the Gateway service proxy from a SharePoint web part context [mind you, Microsoft urges us to step away from this SharePoint server-side model; but it is still possible and supported]
  2. Build a custom BCS Connector to connect to the Gateway service proxy, and use that connector from SharePoint to render the received SAP data via a standard External List, the Business Data WebParts, or a custom-built web part that invokes the BCS API
  3. Build a SharePoint WCF RESTful service to connect to the Gateway service proxy, and consume that service in a browser-based front-end, using a databinding library like knockout.js or Angular.js. This is an example of the SharePoint 2013 App model, and can also be applied in Office 365 / SharePoint Online.

SharePoint version

SharePoint 2010 was built “ages ago”, and has a hard dependency on .NET Framework 3.5. In the current year, the .NET Framework has progressed to .NET Framework 4.5, and the latest version of SharePoint [2013] has caught up with that. The same holds for Duet Enterprise 2.0 [which is actually for SharePoint 2013; if you have SharePoint 2010 you need the Duet Enterprise 1.0 version], and also for GWM: both are .NET Framework 4.x dependent. As a consequence, I have to relax my firm statement a bit: yes, GWM can be used to connect SAP and SharePoint 2013; but not for the older SharePoint versions (2003/2007/2010).

Simple example as proof

To demonstrate, I made up a sample scenario to retrieve SAP CRM data via GWM into SharePoint. I used the demo Gateway services as data source (see the article by Martin Bachmann for instructions on how to get access to the SAP Gateway demo system), and our own SharePoint 2013 environment as front-end. Further I have Visual Studio 2013, and necessarily also the latest GWM SP3 (versions preceding SP3 will not install in Visual Studio 2013; but they do install in Visual Studio 2010/2012).
I applied the guidance provided in the GWM Developer Guide, but instead of a Windows Forms project, chose the SharePoint Visual Web Part template.
Next, you first need to add references to the .NET WCF and GWM libraries. The easiest way to achieve that is by utilizing the GWM Visual Studio Add-In. Click on ‘Add SAP Service Reference’:
Fill in as Service Url the url of the GWDemo service, and press ‘Go’:
In the Service Explorer you can explore and inspect the OData entities that the GWDemo service exposes. Select one, and click 'Ok'. The direct result is the inclusion of multiple GWM assemblies (note that they are still named with the older GWPAM naming) in the Visual Studio project references:
Next, open the visual webpart and put in some code to consume the GWM-created proxy reference and display the received data. As this is merely a short demo to prove the connectivity, I simply display the SAP CRM BusinessPartners through a plain GridView control:
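A minimal sketch of such web part code, assuming illustrative names: the DataServiceContext type 'GWDemoService', the entity set 'BusinessPartnerCollection' and the service URL stand in for whatever the GWM 'Add SAP Service Reference' wizard actually generates, and the GridView is assumed declared as 'BusinessPartnersGrid' in the .ascx markup.

using System;
using System.Linq;
using System.Net;
using System.Web.UI;

public partial class SapBusinessPartnersUserControl : UserControl
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack) return;

        // 'GWDemoService' is the assumed, wizard-generated DataServiceContext type;
        // the service root URL is illustrative only.
        GWDemoService context = new GWDemoService(
            new Uri("https://gatewayhost/sap/opu/odata/IWBEP/GWDEMO/"));

        // Basic authentication, as in this demo; not fit for productive scenarios.
        context.Credentials = new NetworkCredential("demo-user", "demo-password");

        // Bind the retrieved SAP CRM business partners to the plain GridView.
        BusinessPartnersGrid.DataSource = context.BusinessPartnerCollection.ToList();
        BusinessPartnersGrid.DataBind();
    }
}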
Build the Visual Studio project, and deploy the SharePoint solution. Then browse to your SharePoint site, create a new page, and add the GWM-build web part. And voila, we have SAP data in SharePoint:
All in all, building this took me less than an hour. Of course it is only a simple retrieval example, and no effort was spent whatsoever on achieving a good-looking and well-behaving UI. Also, for this simple example I relied on basic authentication. For a trustworthy enterprise context this is not usable, and either OAuth, X.509 or SAML must be applied. But still, the outcome is very promising with respect to the SAP/SharePoint interoperability capability of GWM.

Fixes needed to GWM generated code

Issue 1: Microsoft.CSharp not included in the references
Solution: Add the missing assembly to the Visual Studio project references
Issue 2: The configuration reader tries to read from a Windows Forms based path
Solution: Avoid the attempt to read from the Windows Forms based path
Issue 3: The used GWDemo system gives an error
Solution: Invoke another method…
Honestly, this is an example of why in every SAP interoperability project domain knowledge is also required: the ContactPersonCollection is connected to a business partner key, so you cannot invoke it without one.

Does GWM replace Duet Enterprise?

The answer to this is a clear No: Duet Enterprise as framework has more capabilities than GWM: SAP workflow integration, SAP BW reports publication, roles synchronization, user profile synchronization. Also, Duet Enterprise has complete self-contained support for Single Sign-On between SharePoint and SAP, while with GWM you need additional infra to achieve this (e.g. an X.509 certificates infra in your landscape).
The better question however is: in what scenarios would GWM be sufficient? Well, it might be sufficient in scenarios where all you need is the connectivity from SharePoint to SAP for data CRUDQ actions, and you already have Single Sign-On supporting infra in your IT landscape.

Thursday, July 10, 2014

HowTo include inline-CSS in SharePoint 2013 CEWP / Disable spellcheck

During the initial elaboration phase of a SharePoint project, the CEWP is a handy SharePoint functionality to quickly set up a visual sketch of the application. This enables you to present the end users a first impression of how the application is going to look and behave. In these modern days it is thereby key that the content and functionality are rendered in an attractive and user-appealing manner. This is where CSS comes in.
If you include inline CSS in a CEWP, you might experience that when you save the SharePoint page, the CSS is automatically modified, aka corrupted. This effect is caused by the SharePoint spell check. To avoid this effect, simply wrap the inline CSS in an HTML element with class 'NoSpellCheck'.
Example:
<div class="NoSpellCheck">
    <style type="text/css">
        .PeopleClassification { width:100%; margin-top:1px; font-size:12px; }
    </style>
</div>

HowTo login as different user in SharePoint 2013

A SharePoint platform feature that I use a lot during testing of SharePoint functionalities is to simulate another user. However, in SharePoint 2013 the menu item 'Sign in as Different User' is no longer available. A simple trick to still get this ability is to navigate the browser to:
http://<siteurl>/_layouts/closeConnection.aspx?loginasanotheruser=true
This trick works in all browsers (IE, Safari, Chrome, Firefox).

Saturday, July 5, 2014

Programmatically set decision of Duet Enterprise task

In our award-winning VIEW (Virtual Integrated Enterprise Workplace) concept we include a generic inbox, in which tasks are aggregated from multiple sources: SAP, SharePoint, Oracle, proprietary systems, and so on. The user sees in his/her inbox all outstanding tasks from the diverse task backends. Directly handling the tasks from the generic inbox is also supported.
To make the generic inbox real also for Duet Enterprise tasks, it is required to programmatically approve them via the inbox instead of via the standard Duet Enterprise task forms. A good starting point for how to achieve this is the article ‘how to create Duet SharePoint webparts’ by Ravi Sharma, in which he uses programmatic Duet Enterprise task approval as an example.
The approach Ravi takes in his setup is to directly update the SAP task by explicitly invoking himself, via the SharePoint BCS Application Programming Interface (API), the wfUpdate method on the DuetEnterprise WorkflowTask External Content Type. And if the SAP task update is successful, to afterwards also alter the SharePoint task that functions as reference or placeholder in the SharePoint context for that SAP task.
Although this approach works, it has multiple architectural disadvantages:
  • It bypasses the standard Duet Enterprise workflow capability to update the SAP task via the SharePoint task.
  • It effectively duplicates some of the code and internal workings of the Duet Enterprise workflow update. As an architect, I very much dislike any code duplication.
  • Per conceptual task update, 2 tasks must be explicitly updated: first the SAP task, next the SharePoint task. This makes the task handling more complex.
  • There is a clear difference in the approval of SharePoint tasks that are created via the Duet Enterprise workflow capability, versus regular SharePoint tasks (from any SharePoint workflow, e.g. document approval).
I favor the approach to handle all SharePoint tasks the same, including those that are instantiated as result of Duet Enterprise workflow publication; and to be ignorant of what happens in the SharePoint context, and possibly beyond, as result of the task decision update. To achieve this, my SharePoint task update code is a variant of Ravi’s ‘updateSharePointTask’ method. And in this approach I have no need for the ‘updateSAPTask’ method, as this responsibility is handled by the standard Duet Enterprise workflow capability.
There are 2 caveats when programmatically updating Duet Enterprise tasks:
  1. Generic for SharePoint tasks: the extendedproperties field ‘ows_taskstatus’ must have been set for the task update to be accepted. However, you cannot set this directly yourself; it is set indirectly via the task update call (see the sketch below).
  2. Specific for Duet Enterprise tasks: in the ItemUpdating event receiver, multiple checks are done to verify that it is indeed the intention to propagate to SAP. One of these checks is a string comparison on the concatenation of the set status text and status code. If this concatenation is not present in the Duet Enterprise task PossibleOutcomes (you specify these when you ‘Configure a new SAP workflow task type’ in SharePoint), then the SharePoint task update succeeds, but the update is silently not propagated to the connected SAP side. To make sure the programmatically set status text is in sync with the configured PossibleOutcomes, I derive the status text from that same configured PossibleOutcomes.
The complete code example to programmatically set the decision on the SharePoint task, which via the Duet Enterprise workflow capability then propagates the decision to the connected SAP task, is sketched below.
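A minimal sketch of this pattern, assuming the standard SPWorkflowTask.AlterTask route; the 'PossibleOutcomes' field name and its 'text,code;text,code' format are illustrative assumptions, not the actual Duet Enterprise storage format.

using System.Collections;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Workflow;

public static class DuetTaskDecision
{
    public static void SetDecision(SPListItem task, string outcomeCode)
    {
        // Caveat 2: derive the status text from the configured PossibleOutcomes,
        // so that the text+code concatenation check in the ItemUpdating event
        // receiver passes and the update is propagated to the SAP task.
        string possibleOutcomes = (string)task["PossibleOutcomes"]; // assumed field name
        string statusText = null;
        foreach (string outcome in possibleOutcomes.Split(';'))
        {
            string[] parts = outcome.Split(',');
            if (parts.Length == 2 && parts[1] == outcomeCode)
            {
                statusText = parts[0];
                break;
            }
        }

        // Caveat 1: do not write 'ows_taskstatus' into ExtendedProperties yourself;
        // AlterTask sets it indirectly from the TaskStatus entry in the hashtable.
        Hashtable taskData = new Hashtable();
        taskData["TaskStatus"] = statusText;
        SPWorkflowTask.AlterTask(task, taskData, true);
    }
}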

Thursday, July 3, 2014

Tip - HowTo restore usage of SharePoint tokens in Visual Studio 2013 for webservice entities

I recently set up a new SharePoint 2013 development + demo environment, and among others installed the latest and greatest :-) Visual Studio 2013 edition. In a SharePoint 2013 project, I included some RESTful SharePoint services: WCF services, but also (due to reuse) the older ASMX webservice variant. Within both service variants I aim to use the convenient Visual Studio support for SharePoint design-time replaceable parameters, aka tokens, that are replaced by Visual Studio at solution packaging time with their concrete values.
Example of usage in a WCF service:
Service="TNV.VIEW2.Services.ISAPI.TasksWebProxy.TasksRemoteProxy, $SharePoint.Project.AssemblyFullName$"
Example of usage in an ASMX web service:
<%@ WebService Language="C#" Class="TNV.VIEW2.Dashboard.Services.VIEWProcessService, $SharePoint.Project.AssemblyFullName$" %>
However, I noticed that after SharePoint solution deployment, the deployed .svc and .asmx files still contain the token instead of the actual assembly full name.
The explanation is that Visual Studio applies configuration to determine in which Visual Studio project file types it must search for and replace the SharePoint tokens. And in its default configuration, Visual Studio 2013 does not include the .svc and .asmx file types (but only the file types xml, aspx, ascx, webpart, dwp, and bdcm). An awkward decision if you ask me: WCF REST services are a valid SharePoint architecture option. And although you may doubt the same for .asmx web services, as Visual Studio users we may assume backwards compatibility in the support we receive from this tool.
The disrupted SharePoint development support can easily be restored by augmenting the Visual Studio configuration. You have 2 options here: do it at development system level, so that it applies at build time for all projects that are packaged on the system; or include the configuration in the individual Visual Studio project(s).
I favor the last: a) this way, it will be effective on EVERY system that opens the Visual Studio project: on the development systems of your peer project members, and more importantly on the (e.g. TFS) build server; and b) as for .asmx it is a valid default assumption that this is obsolete technology, it is justifiable that you must explicitly and deliberately restore it for that file type in only those Visual Studio projects in which you still utilize that older technology.
The steps to also include other than the default filetypes for Token replacement are:
  • Open the .csproj file in an editor (in Visual Studio, Notepad(+), …)
  • Locate the line: <SandboxedSolution>False</SandboxedSolution>
    Note that since the project deploys web services, it cannot be a Sandboxed solution
  • And after that line, insert the following line: <TokenReplacementFileExtensions>asmx;svc</TokenReplacementFileExtensions>

Wednesday, July 2, 2014

Beware - incorrect ScriptLink usage will hang up your SharePoint 2010/2013

SharePoint's ScriptLink is a useful class to include javascript resources within the HTML rendering. ScriptLink can be used declaratively - in a masterpage, Visual WebPart, ... - and programmatically - code-behind, webpart. But be aware: in case of incorrect usage, ScriptLink will effectively hang up your SharePoint site, both the 2010 and 2013 versions!!
An example of such incorrect usage is the following code, to include a javascript resource that is provisioned in (a subfolder of) the Style Library:
protected override void OnPreRender(EventArgs e)
{
    base.OnPreRender(e);
    ScriptLink.Register(this.Page, "/Style Library/styles/view-core.js", false);
}
The issue here is that ScriptLink assumes all relative links to be within the SharePoint layouts folder. The url in the example is at runtime converted by ScriptLink into "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS\Style Library\styles\view-core.js".
On itself, this incorrectly derived url reference would not directly break your SharePoint site; it would merely disrupt the expected behavior in the browser, as the javascript is not found and loaded. However, ScriptLink also validates server-side the calculated script url as a cache-safe url, and in case it cannot be validated, it ends the complete SharePoint page rendering; not only that of the erroneous SharePoint artifact. The result in the browser is an empty/white page:
<script type="text/javascript">
    var gearPage = document.getElementById('GearPage');
    if (null != gearPage)
    {
        gearPage.parentNode.removeChild(gearPage);
        document.title = "Error";
    }
</script>
The correct way to include, through ScriptLink, a javascript resource administrated in (a subfolder of) the Style Library, is to use the '~sitecollection' keyword:
protected override void OnPreRender(EventArgs e)
{
    base.OnPreRender(e);
    ScriptLink.Register(this.Page, "~sitecollection/Style Library/styles/view-core.js", false);
}

Friday, June 27, 2014

Utilize Duet Enterprise 1.0 and 2.0 in parallel in enterprise architecture

Duet Enterprise supports migration scenarios in mixed SharePoint 2010 + 2013 landscape
Deployment of Duet Enterprise is a strategic addition to an organization’s existing infrastructure of SAP Business Suites and Microsoft SharePoint. One of the requirements from an enterprise architecture perspective is that the investment in Duet Enterprise is future-proof. The utilization of Duet Enterprise for SAP-SharePoint interoperability must include a roadmap to follow up on the innovations and developments in the SAP and SharePoint platforms. This is where Duet Enterprise 2.0 comes into place: the first version of Duet Enterprise supports SharePoint 2010, its successor supports SharePoint 2013. Duet Enterprise 2.0 is also backwards compatible: Duet Enterprise 1.0 services + BDC Models with the Gateway Generic Channel can be directly reused in SharePoint 2013 to interoperate via SAP NetWeaver Gateway with the SAP business suites.
In the nowadays reality of many SharePoint-using organizations, it is typical that for a time they have a mixed presence in their infrastructure of current SharePoint 2010 based applications, and new SharePoint 2013 developments plus migrations from SharePoint 2010. In such situations, enterprise architecture aims to utilize Duet Enterprise for SAP-SharePoint integration on both SharePoint versions, connecting to the same SAP business suites. SAP NetWeaver Gateway, in its role of central gateway to the SAP backend landscape, supports this enterprise architecture demand: the same single Gateway system can serve multiple consumers, including multiple SharePoint farms. The consequence is that investments made in Duet Enterprise 1.0 on the SharePoint 2010 platform can be harvested when gradually migrating to a SharePoint 2013 context with Duet Enterprise 2.0 deployed.
In our demo landscape we have this mixed landscape operational. On a SharePoint 2010 farm we have Duet Enterprise 1.0 based scenarios operational, and on a SharePoint 2013 farm the same Duet Enterprise 1.0 scenarios and also new Duet Enterprise 2.0 scenarios.
A prerequisite for this parallel SharePoint 2010 + Duet Enterprise 1.0 and SharePoint 2013 + Duet Enterprise 2.0 setup is that the SharePoint farms must use the same SharePoint STS certificate. The reason for this is that in Duet Enterprise 1.0 scenarios, SAML is used for the Single Sign-On handling. Both SharePoint 2010 and SharePoint 2013 use the SharePoint STS certificate to sign the SAML assertion that is added to Duet Enterprise 1.0 requests originating through the SharePoint Business Connectivity Services (BCS) application.
As our landscape is not a production system, I used a self-signed certificate for the SharePoint STS service applications in the 2010 and 2013 farms. This saves us from the nuisance of having to renew the STS certificate each year. Something to be aware of: in case of using a self-signed certificate on server A that was generated on another server B, it is also required to import on server A the internal root certificate of server B. This enables server A to verify the full chain of certificates of the self-signed certificate generated on server B.

Thursday, June 19, 2014

SharePoint 2013 Search ‘InternalQueryErrorException’ in case of sorting on non-sorted property

After deployment plus configuration of an out-of-the-box component in our SharePoint 2013 farm, that component reported a fatal error when used on a SharePoint page:
Microsoft.Office.Server.Search.Query.InternalQueryErrorException: Search has encountered a problem that prevents results from being returned. If the issue persists, please contact your administrator...
As the logged information is very limited / useless, I analyzed the problem by executing the same code in a separate console application, and via trial-and-error found out what exactly causes the problem.
The problem cause was this:
  • In a KeywordQuery, a managed property is added to the SortList parameter collection
  • However, that same managed property was not configured as Sortable in SharePoint Search administration
Executing the KeywordQuery via SearchExecutor.ExecuteQuery() results in the Search Service Application (SSA) throwing the 'InternalQueryErrorException', due to the mismatch between Search administration and Search query/usage.
With this inner knowledge, the quick fix is to set the managed property as ‘Sortable’ in SharePoint 2013 Search Administration.
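A minimal repro sketch of the failing pattern; the managed property name 'ModifiedDate' and the site URL are illustrative assumptions:

using System;
using Microsoft.Office.Server.Search.Query;
using Microsoft.SharePoint;

class SortListRepro
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://sharepoint.demo.local"))
        {
            KeywordQuery query = new KeywordQuery(site);
            query.QueryText = "ContentType:Document";
            // Throws InternalQueryErrorException at execution time when
            // 'ModifiedDate' is not marked Sortable in Search administration.
            query.SortList.Add("ModifiedDate", SortDirection.Ascending);

            SearchExecutor executor = new SearchExecutor();
            executor.ExecuteQuery(query);
            Console.WriteLine("Query executed without InternalQueryErrorException.");
        }
    }
}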

Sunday, June 15, 2014

Tip: run 'New-SPODataConnectionSetting' as administrator

The SharePoint 2013 Business Connectivity Services application also includes support for consumption of external REST OData services. The connectivity requires an ODataConnection, which can be created via the new PowerShell cmdlet New-SPODataConnectionSetting.
If you invoke this cmdlet under an 'ordinary' SharePoint account, you may encounter the error: The Web application at <ServiceContext URI> could not be found. Verify that you have typed the URL correctly. The typical cause is that the SharePoint account has insufficient rights to access the Central Admin web application. The simple resolution is then to run PowerShell via "Run as administrator".

Duet Enterprise 2.0 - installation troubleshooting

Initially, the installation of the first version (1.0) of Duet Enterprise was cumbersome and labour-intensive, and therefore also error-prone. With the Duet Enterprise 1.0 installation wizard at the SAP side, the installation was already substantially improved. And this has progressed on to the installation of the new version, Duet Enterprise 2.0. However, as the landscape is inherently complex (at minimum one, but typically more SAP backends, a SAP Gateway system, and a SharePoint 2013 farm), you may still face issues.
In this blog I describe some of the issues that I encountered during Duet Enterprise 2.0 installations, and share my resolutions.

Issue 1: ‘Add-on IW_DUETE Release 100 can only be installed in client 000’

Import of the Duet Enterprise 2.0 SAP Add-On via transaction SAINT gives the following error:
Resolution: Use client 000 to import the add-on via transaction SAINT.

Issue 2: ‘The remote certificate is invalid according to the validation procedure’.

This error can occur in multiple SharePoint-SAP scenarios: runtime invocation of a Duet Enterprise service from the SharePoint 2013 side, or import of a Duet Enterprise 2.0 BDC Model: Application definition import failed. The following error occurred: The remote certificate is invalid according to the validation procedure.
Resolution: Typically the above error message is caused by a mismatch between the SSL certificate used on the SAP Gateway side to encrypt the traffic, and the SAP SSL certificate that has been imported in SharePoint 2013 ‘Manage trust’. To repair, import the SSL certificate again. You can ask the SAP BASIS administrator to again export it from SAP Gateway STRUST. But a more convenient approach, and in my experience also one with better results (this was already so for Duet Enterprise 1.0), is to open the SAP service url on the SharePoint web frontend, export the SAP SSL certificate to file from the SSL certificate warning, and next add it to ‘Manage trust’.

Issue 3: ‘The root certificate that was just selected is invalid’

Importing the SSL certificate of SAP NetWeaver Gateway in SharePoint 2013 ‘Manage Trust’ results in the error:
Resolution: import the SSL certificate via Internet Explorer (11) instead of Firefox.

Issue 4: ‘No service found for namespace /IWWRK/, name DUET_WORKFLOW_CORE’

Manifests itself upon configuration of the Duet Enterprise 2.0 Workflow solution.
Resolution: Add and activate the Duet Enterprise 2.0 Gateway REST service in MAINT_SERVICE.

Issue 5: DuetConfig: ‘The operation name is missing or invalid’

For instance, it occurs with the DuetConfig command to configure workflow, a command with multiple operation parameters: DuetConfig.exe -importbdc -FeatureName Workflow -LsiUrl "https://tnvsrm.tnv.corp/sap/opu/odata/IWWRK/DUET_WORKFLOW_CORE;mo;c=SHAREPOINT_DE/" -BdcServiceApplication "Business Data Connectivity Service" -UserSubLsiUrl "https://tnvsrm.tnv.corp/sap/opu/odata/IWBEP/SUBSCRIPTIONMANAGEMENT;mo;v=2;c=SHAREPOINT_DE/"
Resolution: carefully check in the command line that the start character of each operation parameter name is indeed the minus (‘-‘) character. In case you copied the command line from the deployment guide, it may invisibly be the wrong character. To be sure, explicitly (replace and) type the ‘-‘ character in the command line.

Issue 6: ‘Lobsystem (External System) returned authentication error’

For instance, occurs upon trying to import a Duet Enterprise 2.0 BDC Model into the SharePoint 2013 Business Connectivity Services application:
Resolution 1: Check that the Duet Enterprise 2.0 generated ‘Duet Root Certificate Authority’ is added to SSL Server Standard in STRUST on the SAP Gateway system.
Resolution 2: Check the user mapping of the SharePoint account used to invoke the SAP Gateway service / import the Duet Enterprise 2.0 BDC Model. In particular check that the extid contains the correct pattern (account name, comma + space, and then the domain name in small capitals):

Issue 7: Import of a Duet Enterprise 1.0 BDC Model fails due to a missing ‘WSDL’ application definition

Resolution: Add a “Duet Enterprise 1.0 WSDL” application definition in SharePoint 2013 Secure Store.

Issue 8: Invocation of a Duet Enterprise 1.0 service fails due to missing SSO

Browsing a SharePoint 2013 External List results in the error ‘An unsecured or incorrectly secured fault was received from the other party’.
Inspect the SRT_UTIL ErrorLog:
Resolution: Besides the Duet Enterprise 2.0 SSO approach based on X.509 certificates, also enable the Duet Enterprise 1.0 SAML2 approach. Use the Duet Enterprise 1.0 Deployment Guide for information on how to enable SAML2 in the Duet Enterprise 2.0 (Gateway 2.0 + SharePoint 2013) landscape.

Epilogue

Compared to the excellent troubleshooting guide for Duet Enterprise 1.0, the above list of issues plus resolutions is much smaller. The Duet Enterprise product team of SAP plus Microsoft has clearly improved its delivery on this aspect.
Others that have been deploying Duet Enterprise 2.0 may have encountered different issues than I did so far. Still, a general classification of all issues seen so far is that they are caused by manual error; indirectly caused by the sometimes unclear, fragmented and even incorrect Duet Enterprise deployment guides.

Friday, June 6, 2014

Winner of SAP Microsoft Unite Partner Connection Customer Impact and Value Award with VIEW solution

I am very proud that The Next View has won the 2014 edition of the SAP Microsoft Unite Partner Connection Award for Customer Impact and Value with our VIEW solution!!
VIEW stands for Virtual Integrated Enterprise Workplace. For us, the VIEW concept is not new; I for instance already defined and published parts of the Conceptual Solution Architecture on this blog back in 2009. But it is only now, with the advent of the modern integration technologies SAP NetWeaver Gateway, Duet Enterprise and Gateway for Microsoft (GWM), plus the availability of standard functional products from our partner Cordis Solutions, that we are enabled to actually realize the VIEW concept in a cost-effective manner.
So, what does VIEW stand for? VIEW is a new operating concept centered on an employee-centric mindset. In VIEW, we strive to optimally enable an organization’s employees to perform their daily work-related activities. In current reality, this work execution often means that one must operate in (and switch between) multiple applications and systems, monitor multiple task lists in different environments (SAP, SharePoint, Oracle, Outlook, ...), and remember the login credentials of the diverse systems. With VIEW, we relieve the employees from all this ‘IT landscape’ hassle. Instead of employees having to explicitly go to all the different applications, in VIEW we collect all the work execution in a central place: the VIEW landing page.
And this VIEW landing page has multiple appearances: desktop and mobile, to fit in with the nowadays reality that employees are [willing to be] always connected to the business systems, to at minimum monitor and act on urgent matters.
Although the VIEW concept does not mandate this, the typical platform for the desktop appearance is SharePoint, as this is in the majority of organizations the declared [by Information Management, Enterprise Architecture] business web platform. The VIEW landing page is merely added as a new employee business application within the already present SharePoint-based intranet.
Also for the mobile appearance, SharePoint can be the platform [certainly SharePoint 2013 has made some big steps in enabling us to provide a proper mobile appearance, among others taking into account the diversity in mobile devices]. But the mobile landing page can also be hosted outside SharePoint, e.g. via the SAP Mobile Platform [SMP], a hybrid app [HTML5/CSS/PhoneGap], and other alternatives. Again, the VIEW concept does not put strict restrictions on this.
If you want to learn more about the VIEW solution, check out the SAP Microsoft Unite Partner Connection solution brief.

Sunday, May 4, 2014

GWPAM renamed into SAP Gateway for Microsoft, short GWM

‘SAP NetWeaver Gateway Productivity Accelerator for Microsoft’: that is a mouthful. This long name gave rise to several variations of how the product is actually called in the market and press, resulting in confusion and making it difficult to find product information. SAP product management has acknowledged this drawback, and has now assigned a more catchy product name: SAP Gateway for Microsoft, abbreviated as GWM.
Other GWM news besides this name change is the availability of GWM Service Pack 02. Key parts of SP2 are alignment with the Visual Studio 2012 [modern UI] look and feel, support for Microsoft Office 2013, support for SAP Fiori services consumption, and a project template for building your own Excel Add-In. Earlier versions of GWM already include the capability to link an Excel sheet to a Gateway REST/OData service for (mass) data management. This feature is aimed at business people who arrange their own (master) data management via Excel. The new Excel Add-In template is designed specifically for the developer, to build your own innovative solutions on the Excel UI platform.
See also: SAP CodeTalk: GWPAM Update / Interview with Holger Bruchelt

Saturday, May 3, 2014

Corrupt SharePoint account breaks Duet Enterprise workflow publishing

One of our Duet Enterprise customers, where I had among others configured the workflow capability, requested my support as workflow tasks were no longer published from the SAP workflow backend into SharePoint. I performed cause analysis in the combined SAP + SharePoint landscape, starting with inspecting the logs - SAP backend, SAP Gateway and SharePoint systems. The SAP Gateway log contained a recurring error “logical port not found for routing url”.
I checked SIMGH on the Gateway system, and found a valid routing url that referred to ‘LOGICALPORTFORWORKFLOW’; and in SOAMANAGER an active logical port with this name. But when I tried to ping the /IWTNG/CO_TASKFLOW_WEB_SERVICE consumer proxy, HTTP error 405 was returned. The actual cause of the malfunctioning workflow publishing from Gateway into SharePoint was the SharePoint service account that Gateway uses to authenticate against the Duet Enterprise webservice '/_vti_bin/OBAWorkflowService.asmx' on the SharePoint server. It had somehow become corrupted, and lacked the authorization to access the WSDL document of the Duet Enterprise service. After recreating/resetting the service account, workflow publishing is working fine again.
Note: the logged message ‘No logical port found’ can be misleading; in the ABAP code it is the generic catch of whatever problem is encountered upon trying to propagate SAP task notifications from SAP Gateway to SharePoint.

Sunday, April 27, 2014

HowTo send JSON data from WCF service with special characters

The manner to return special characters within the JSON response of a (SharePoint-based) WCF REST service consists of 3 essential elements:
  1. Define the method response as System.IO.Stream;
  2. Set the charset of the JSON response to 'ISO-8859-1';
  3. Apply Unicode encoding to the JSON response string.
Code example
[ServiceContract(Namespace = "...")]
public interface IRESTService
{
    [WebInvoke(Method = "GET",
        UriTemplate = "/RESTMethod?$parameter1={parameter1}&...",
        RequestFormat = WebMessageFormat.Json,
        ResponseFormat = WebMessageFormat.Json,
        BodyStyle = WebMessageBodyStyle.Bare)]
    System.IO.Stream RESTMethod(string parameter1, ...);
}

[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]
public class RESTService : IRESTService
{
    public Stream RESTMethod(string parameter1, ...)
    {
        // Element 2: set the charset of the JSON response to ISO-8859-1.
        OutgoingWebResponseContext context = WebOperationContext.Current.OutgoingResponse;
        context.ContentType = "application/json; charset=ISO-8859-1";
        string jsonData = ...;
        // Elements 1 + 3: encode the JSON string and return it as a raw stream,
        // so WCF does not re-serialize (and thereby escape) the special characters.
        return new MemoryStream(UnicodeEncoding.Default.GetBytes(jsonData));
    }
}

Monday, April 7, 2014

Service Application Proxies runtime reported with locale-dependent TypeName

The SharePoint architecture enables shared usage of service applications across farms. A typical setup is a Shared Services farm that hosts the service applications, and multiple consumer / front-end farms that host the webapplications. In the consumer farm(s), each individual webapplication is associated with the service applications it requires. For instance, webapplication A is associated with Secure Store and Business Connectivity Services, and webapplication B is associated with Secure Store and Secure Token service applications.
In a distributed SharePoint architecture you cannot programmatically access the service applications in the local farm. Instead you must go via the service application proxy that is associated with the webapplication. The retrieval model for this is weakly-typed: you retrieve the desired service application proxy by string comparison (!) on the proxy TypeName. This weakly-typed usage model is error-prone; you can easily make a typo that goes unnoticed at compile time. But you will be immediately aware upon the first runtime test of the code.
However, this weakly-typed model incorporates another strange behavior: the reported TypeName is locale dependent! In my local SharePoint image, I tested against an EN-US sitecollection to retrieve the BCS service proxy, filtering on TypeName:
BdcServiceApplicationProxy proxy =
    webApplication.ServiceApplicationProxyGroup.Proxies.SingleOrDefault(
        p => p.TypeName == "Business Data Connectivity Service Application Proxy")
    as BdcServiceApplicationProxy;
With the above code, I successfully retrieve the BCS service application proxy.
But the same code running against a webapplication in the integration-test farm does not select the BCS service application proxy. In Central Admin I verified that the webapplication is associated with the BCS service application. So what is the problem here? The cause rather surprised me: the sitecollection in the integration-test environment is provisioned via a Dutch-locale site definition. And as an unexpected side-effect, the TypeNames of the associated service application proxies are now all reported in their Dutch localized names.
The fix for the above code is to make it locale-independent. For BCS this is possible by filtering on the TypeName pattern ‘Business Data Connectivity’:
BdcServiceApplicationProxy proxy =
    webApplication.ServiceApplicationProxyGroup.Proxies.SingleOrDefault(
        p => p.TypeName.Contains("Business Data Connectivity"))
    as BdcServiceApplicationProxy;
For other service application proxies it might be required to compare the TypeName against the established Resources value:
private static string _BCSApplicationProxyTypeName = null;

// Derived from Microsoft.SharePoint.CoreResource
static string BCSApplicationProxyTypeName
{
    get
    {
        if (String.IsNullOrEmpty(_BCSApplicationProxyTypeName))
        {
            Assembly _aIntl = Assembly.Load(
                "Microsoft.SharePoint.intl, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c");
            ResourceManager _BusinessDataRM =
                new ResourceManager("Microsoft.BusinessData.strings", _aIntl);
            _BCSApplicationProxyTypeName = _BusinessDataRM.GetString(
                "ApplicationRegistry_BdcServiceApplicationProxy_TypeName");
        }
        return _BCSApplicationProxyTypeName;
    }
}

BdcServiceApplicationProxy proxy =
    webApplication.ServiceApplicationProxyGroup.Proxies.SingleOrDefault(
        p => p.TypeName == BCSUtility.BCSApplicationProxyTypeName)
    as BdcServiceApplicationProxy;

Sunday, April 6, 2014

Asynchronous invocation of BCS

Return early from a time-expensive BCS handling

Functional scenario: from the UI context, check the user input via serverside business validation, and if functionally correct, propagate the same user input for processing in the business backend.
In this scenario, the user only needs to wait for the duration of the validation step; if the outcome reports functional errors, these must be reported to the user so (s)he is able to correct them. In case of no errors, the subsequent business processing can be performed transparently to the user; there is no need to explicitly wait / block until this (potentially time-expensive) processing is completed in the backend.
Technical context: an HTML5/JavaScript client that interoperates with the SAP business backend via SharePoint RESTful services, which internally utilize SharePoint BCS to invoke Duet Enterprise / Gateway services. A sketch of what such a service contract could look like is given below.
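To make this concrete, below a hedged sketch of the contract of such a SharePoint WCF REST service. The operation names CheckItems and CreateItems correspond to the client-side invocations shown further on in this post; the interface name and everything else is assumed.
// Hedged contract sketch; CheckItems / CreateItems match the client-side calls
// shown later in this post, the remainder (names, raw Stream style) is assumed.
using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract(Namespace = "...")]
public interface IBCSAccessService
{
    // Blocking validation step: the client waits for this response and
    // displays the returned functional messages (ET_MESSAGES) to the user.
    [WebInvoke(Method = "POST", UriTemplate = "/CheckItems",
        BodyStyle = WebMessageBodyStyle.Bare)]
    Stream CheckItems(Stream jsonBody);

    // Time-expensive propagation into the SAP backend via BCS; invoked
    // fire-and-forget from the browser once validation succeeded.
    [WebInvoke(Method = "POST", UriTemplate = "/CreateItems",
        BodyStyle = WebMessageBodyStyle.Bare)]
    Stream CreateItems(Stream jsonBody);
}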

Inconvenience of operating BCS from a background thread

Initially, I thought it would be simple to realize the sketched execution context: handle the first step on the main thread of the SharePoint WCF REST service; if errors are detected, report these back; if no errors are noticed, delegate the time-expensive propagation step to a background thread, and directly return from the main thread to unblock the client.
However, SharePoint / BCS reality proved less trivial…
In my first attempt, I directly interoperate against BCS on the background thread:
main thread:
PerformCreation job = new PerformCreation()
{
    _createMethod = createMethod,
    _jsonBody = jsonBody,
    _siteId = SPContext.Current.Site.ID,
    _userToken = SPContext.Current.Site.UserToken
};
Thread backgroundThread = new Thread(new ThreadStart(job.PerformCreate));
backgroundThread.Start();
background thread:
public void PerformCreate()
{
    using (SPSite site = new SPSite(_siteId, _userToken))
    {
        BCSModuleCall.Create(site, _createMethod, _jsonBody);
    }
}
When I executed this, the BCS Create invocation faults with a ‘Cannot complete this action’ exception. This originates from the BCS-internal method call ‘CalculatePermissionsForCurrentThread’, which checks whether, in the context of the executing thread, the user has the rights to perform the BCS operation. Via code reverse engineering, I detected that the 'Microsoft.SharePoint.BusinessData.Infrastructure.BdcAccessControlList.AccessCheck(BdcRights rights)' method requires an active SPRequest context, and that only lives on the main / SharePoint thread.
To arrange that the BCS Create method is executed within the context of an active SPRequest, in my second attempt I changed the code to issue, serverside from the background thread, a webrequest to the SharePoint REST service. Although this technically works (that is, ‘CalculatePermissionsForCurrentThread’ succeeds and the BCS Create operation is successfully performed), a drawback is that you lose the credentials of the logged-on SharePoint user. It is not possible to propagate the claims-authenticated identity of the logged-on user to the webrequest invoked via the WebClient class; it always authenticates, and thus executes, with the Application Pool identity.
public void PerformCreate()
{
    var claimIdentity = (Microsoft.IdentityModel.Claims.ClaimsIdentity)
        Thread.CurrentPrincipal.Identity;
    var upnClaim = claimIdentity.Claims.FirstOrDefault(
        c => c.ClaimType.Equals(Microsoft.IdentityModel.Claims.ClaimTypes.Upn,
            StringComparison.InvariantCultureIgnoreCase));
    if (upnClaim != null)
    {
        string upn = upnClaim.Value;
        WindowsIdentity windowsIdentity =
            Microsoft.IdentityModel.WindowsTokenService.S4UClient.UpnLogon(upn);
        var wic = windowsIdentity.Impersonate();
        try
        {
            // SharePoint BCS API requires a current SPContext. In this thread,
            // there is no SPContext. To make sure the create is executed within
            // a current SPContext, create the entities by issuing a REST request.
            WebClient webClient = new WebClient()
            {
                UseDefaultCredentials = true,
                BaseAddress = _baseAddress
            };
            webClient.Headers.Add(HttpRequestHeader.ContentType, "application/json;charset=utf-8");
            webClient.UploadDataAsync(
                new Uri(String.Format("{0}/CreateItems", _baseAddress)),
                "POST",
                Encoding.UTF8.GetBytes(_jsonBody));
        }
        finally
        {
            wic.Undo();
        }
    }
}
In our scenario, it is a necessity that all data updates are authorized and audited for the issuing user. So although it is technically possible to delegate the BCS data update to a serverside background thread in this manner, from the perspective of data governance it is not allowed.

Solution: operate BCS client-side in asynchronous mode

Ultimately, I abandoned the approach of serverside splitting up the BCS invocation into a synchronous blocking part and an asynchronous non-blocking part, and instead decided to manage this client-side. First, issue a REST JSON request for the data validation, and let the user wait (block) on the response. If no validation errors are detected, issue from the browser the second, update request with the same JSON object in fire-and-forget fashion.
var checkUrl = bindingContext.$root.siteUrl +
    "/_vti_bin/WebApi/BCSAccessService.svc/CheckItems";
var checkCall = jQuery.ajax({
    url: checkUrl,
    type: "POST",
    data: JSON.stringify(json),
    success: function (data) {
        if (Utilities.CheckNoErrorInResponse(data.BODY.ET_MESSAGES)) {
            var submitUrl = bindingContext.$root.siteUrl +
                "/_vti_bin/WebApi/BCSAccessService.svc/CreateItems";
            // Data is already validated:
            // fire-and-forget ajax async request and ignore response.
            var submitCall = jQuery.ajax({
                url: submitUrl, type: "POST", data: JSON.stringify(json) });
            alert("User input is validated and submitted.");
            window.parent.jQuery('#dialogForm').dialog('close');
        } else {
            Utilities.DisplayResponseMessages(data.BODY.ET_MESSAGES);
        }
    }
}).fail(Utilities.ErrorHandler);
And this works out fine: the end user regains input control and responsiveness in the front-end application sooner, while the update in the back-end is still processed in the background with the credentials of the logged-on user.

Friday, March 14, 2014

Use JSOM from a plain webpage

With the richness of available javascript libraries (knockout.js, angular.js, …) combined with SharePoint JSOM, you can nowadays build some real nice functionality without the need for server-side code. A drawback however seemed that, to be able to use JSOM, the script needs to be included on a SharePoint page. In some cases this is fine, when you utilize JSOM/Knockout/Angular to enrich a SharePoint page (sitepage, publishing page) with dynamic clientside behavior. But there are also scenarios in which the clientside behavior stands on its own (e.g. functionality exposed to the user via a dialog), and then it is undesirable to include the SharePoint overhead: bloated HTML and a heavier page payload.
So the question is: can you use JSOM outside the context of a SharePoint page? It seems evident that this must be possible; after all, it boils down to identifying the right resources to send to the browser, which then does the work. However, it appeared to be a little less easy and logical to set up such a context, and the Microsoft-provided information is insufficient and unclear.
After some trial-and-error, I now have it working. The required elements in the setup are the following:
  1. The page must be rendered by SharePoint; can be from the layouts directory (note: not as an application page), or from a SharePoint Document Library
  2. The page cannot be a plain .html, as it is required to utilize the SharePoint server-rendering pipeline
  3. Include a SharePoint.ScriptLink to deferred-load 'sp.js'
  4. First trick: include '<SharePoint:FormDigest runat="server"></SharePoint:FormDigest>' in your page
  5. Second trick: encapsulate the javascript code that invokes JSOM, within a '<form runat="server"> -- </form>'. Especially this aspect took me some trial/discovery time.
With these 5 aspects applied, SharePoint JSOM can be used from a lean and lightweight webpage, without the SharePoint master page overhead.
Example:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<%@ Page Language="C#" %>
<%@ Register Tagprefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls"
  Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<html dir="ltr" xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
  <meta name="WebPartPageExpansion" content="full" />
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  <title>Plain JSOM usage</title>
  <SharePoint:ScriptLink language="javascript" name="sp.js" OnDemand="false" runat="server" Localizable="false" LoadAfterUI="true"/>
  <!-- The script below also uses jQuery and Knockout; load these from your own location (the paths here are assumptions): -->
  <script type="text/javascript" src="/_layouts/scripts/jquery.min.js"></script>
  <script type="text/javascript" src="/_layouts/scripts/knockout.min.js"></script>
  <SharePoint:FormDigest ID="FormDigest1" runat="server"></SharePoint:FormDigest>
 </head>
<body> 
  <form id="form1" runat="server">
     <script language="ecmascript" type="text/ecmascript">                                    
         $(document).ready(function () {
             EnsureScriptFunc("sp.js", "SP.ClientContext", function () {
                 var ctx = new SP.ClientContext();
                 var site = ctx.get_site();
                 ctx.load(site, 'Url');
                 ctx.executeQueryAsync(function (s, a) {
                     ko.applyBindings(new Demo.AppViewModel.ProcessesVM(site.get_url()), document.getElementById('DemoUsage'));
                 });
              });
         });
    </script>
 </form>
</body>
</html>