Saturday, November 19, 2011

Alternative ways to programmatically read the contents of an External List with filtered view

In order to export the displayed contents of a BCS External List to Excel (see previous post), I first have to programmatically retrieve the contents in custom code. The External List has a filter applied to its Finder method, set via the default SPView:
My first thought was therefore that the code below should retrieve the same filtered external data as when the BCS External List is rendered on a SharePoint page:
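The snippet in the original post is not included here; a minimal sketch of what this first attempt looked like, assuming a web object in scope and a hypothetical list name "Notifications":

```csharp
// Sketch of the first attempt (assumed reconstruction; the list name is hypothetical).
// Expectation: GetItems with the DefaultView returns the same filtered rows
// as the XsltListViewWebPart renders on the page.
SPList externalList = web.Lists["Notifications"];
SPView defaultView = externalList.DefaultView;

// In practice this returned an empty collection: the filter set on the
// Finder method via the view was not passed through to the BCS service call.
SPListItemCollection items = externalList.GetItems(defaultView);
```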
However, GetItems returns an empty collection. When debugging the called GetNotifications web service method, I discovered that the filter parameter always has the value ‘null’. Why is unclear to me, since it has been set in the DefaultView.
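A working alternative is to dive under the BCS hood and retrieve the external data directly via the BCS object model, setting the filter value explicitly. A sketch, assuming hypothetical entity namespace, entity name, and filter value (your BDC model determines the actual ones):

```csharp
// Sketch of retrieval via the BCS object model (all names are hypothetical).
BdcService service = SPFarm.Local.Services.GetValue<BdcService>(String.Empty);
IMetadataCatalog catalog =
    service.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current);
IEntity entity = catalog.GetEntity("http://sap.example", "Notification");
ILobSystemInstance lobInstance =
    entity.GetLobSystem().GetLobSystemInstances()[0].Value;

// Explicitly set the filter value that the view fails to pass through.
IFilterCollection filters = entity.GetDefaultFinderFilters();
((ComparisonFilter)filters[0]).Value = "someFilterValue";

IEntityInstanceEnumerator external = entity.FindFiltered(filters, lobInstance);
while (external.MoveNext())
{
    // ... read the fields from external.Current ...
}
```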
However, although this approach works, it left me rather dissatisfied. Conceptually I want to export the contents of the same (External) List that I have already provisioned on a SharePoint page; so why should I have to dive under the BCS hood to get the same contents? Also, this code is tightly coupled to the BCS API, while the 'Export SPList to Excel' functionality is in itself generic. So, with some ample time available, I decided to analyze the way in which the standard XsltListViewWebPart issues the external data retrieval, using the JetBrains dotPeek reflector tool (as .NET Reflector is no longer free of license charge). It appears that this is done slightly differently from my original attempt:
Not only is this far fewer lines of code, but the code is also generic: applicable to both regular SharePoint lists and BCS External Lists.
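The decompiled snippet is again an image in the original post; a sketch of the general approach as I understand it from the text, assuming the retrieval goes via an SPQuery constructed from the default view (list name is hypothetical):

```csharp
// Sketch (assumed reconstruction of the decompiled approach): construct an
// SPQuery from the view, instead of passing the SPView itself to GetItems.
// This works for both regular SharePoint lists and BCS External Lists.
SPList list = web.Lists["Notifications"];   // hypothetical list name
SPQuery query = new SPQuery(list.DefaultView);
SPListItemCollection items = list.GetItems(query);
```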

Friday, November 18, 2011

Excel 2010 Protected View hinders browser-opening of downloaded .xlsx file

A user requirement in one of our SharePoint 2010 projects is to export the displayed contents of an External List (with content originating from SAP ERP, retrieved via SharePoint BCS connecting to BAPI-based web services) to an offline file at any moment. The functional rationale is version administration for history and auditing purposes. The SharePoint platform supports this out-of-the-box for regular Lists, via the Export to Excel functionality. However, not so for BCS External Lists. But you can realize it yourself via some custom code: first retrieve the External List contents, and next construct a .xlsx file via the Open XML SDK. The .xlsx file is generated server-side in memory, and sent to the browser as HttpResponse content. The end-user can then either open the file, or save it somewhere client-side:
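The code in the original post is an image; a minimal sketch of such server-side generation with the Open XML SDK (sheet name, file name, and the omitted row-building are illustrative):

```csharp
// Sketch (assumed implementation): generate the .xlsx in memory via the
// Open XML SDK, and stream it to the browser as the HttpResponse content.
using (MemoryStream stream = new MemoryStream())
{
    using (SpreadsheetDocument doc =
        SpreadsheetDocument.Create(stream, SpreadsheetDocumentType.Workbook))
    {
        WorkbookPart workbookPart = doc.AddWorkbookPart();
        workbookPart.Workbook = new Workbook();
        WorksheetPart worksheetPart = workbookPart.AddNewPart<WorksheetPart>();
        worksheetPart.Worksheet = new Worksheet(new SheetData());
        workbookPart.Workbook.AppendChild(new Sheets()).AppendChild(
            new Sheet { Id = workbookPart.GetIdOfPart(worksheetPart),
                        SheetId = 1, Name = "Export" });
        // ... fill the SheetData with rows built from the list items ...
        workbookPart.Workbook.Save();
    }

    HttpResponse response = HttpContext.Current.Response;
    response.Clear();
    response.ContentType =
        "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    response.AddHeader("Content-Disposition", "attachment; filename=export.xlsx");
    response.BinaryWrite(stream.ToArray());
    response.End();
}
```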
A strange thing I noticed: when saving the file, that saved file can afterwards be opened successfully. But when choosing to directly open the file instead, Excel 2010 displays the error message “The file is corrupt and cannot be opened”.
This must be a client-side issue; the server-side is not aware of the context in which the client-side handles the received HttpResponse (note: via Fiddler I even verified that the HttpResponse contents were identical in both cases).

The resolution is hinted at in the File Download window, by the trust warning about files downloaded from the internet. The default Excel 2010 Trust Center settings are to distrust all content downloaded from non-trusted locations. To validate this I unchecked the default settings in Excel 2010 (via File \ Options \ Trust Center \ Trust Center Settings \ Protected View):
This helps: Excel 2010 now directly opens the downloaded file.

Thursday, November 10, 2011

Inconvenient configuration of Forms-Based Authentication in SharePoint 2010

For an extranet we aim to utilize Forms-Based Authentication to authenticate external users. In SharePoint 2010 this means that you have to apply the Claims-Based authentication model, and next set up the SharePoint web application configuration for the Membership provider that will be used for FBA. This is where things can get a little confusing. In fact you have to configure FBA membership in 3 different locations, and it is essential that all 3 are in sync:

1. In Central Admin

Configure Web Application \ Authentication Providers; here you specify the names of the Membership and Role providers used in FBA context within the web application.

2. Security Token context

In the SharePoint 2010 service application architecture, membership handling is delegated to the SecurityToken service application. This is a major difference from the SharePoint 2007 architecture, in which the individual web application processes handle the membership themselves. A direct consequence for configuration is that you need to specify in the web.config of the SecurityToken service application all the membership providers and role providers that are used in the SharePoint farm.
The SecurityToken service application directory is located at: 14hive\WebServices\SecurityToken
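For illustration, a sketch of the kind of entries that go into the SecurityToken web.config; the provider name, type, and connection string name below are hypothetical, you register your actual FBA providers here:

```xml
<system.web>
  <membership>
    <providers>
      <!-- name is hypothetical; it must match what is entered in Central Admin -->
      <add name="FBAMembershipProvider"
           type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
           connectionStringName="FBADB" />
    </providers>
  </membership>
  <roleManager>
    <providers>
      <add name="FBARoleProvider"
           type="System.Web.Security.SqlRoleProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
           connectionStringName="FBADB" />
    </providers>
  </roleManager>
</system.web>
```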

3. Webapplication context

Finally, despite membership handling being done in SharePoint 2010 via the independent SecurityToken service application, you may still need to add the membership provider to the web.config of the individual web application. This is required in case you want to be able to use the standard PeoplePicker for selecting credentials from the FBA membership provider. Besides adding the ‘membershipprovider’ node, you then also have to point the PeoplePicker to that membership provider.
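As a sketch (the provider name is hypothetical, and must match the other two locations), the web application's web.config then gets the same provider registration, plus a PeoplePickerWildcards entry that directs the PeoplePicker to that provider:

```xml
<configuration>
  <SharePoint>
    <PeoplePickerWildcards>
      <clear />
      <!-- direct the PeoplePicker to the FBA membership provider -->
      <add key="FBAMembershipProvider" value="%" />
    </PeoplePickerWildcards>
  </SharePoint>
  <system.web>
    <membership>
      <providers>
        <!-- same entry as in the SecurityToken web.config -->
        <add name="FBAMembershipProvider"
             type="System.Web.Security.SqlMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
             connectionStringName="FBADB" />
      </providers>
    </membership>
  </system.web>
</configuration>
```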
What if the 3 locations are not in sync? I evaluated the different inconsistent configurations:

1. Name entered in Central Admin does not match the name in the SecurityToken’s web.config

In this case, SharePoint identity handling cannot get a handle to the configured MembershipProvider. When someone tries to log in via Forms-Based Authentication, SharePoint identity handling will report the displayed exception.
Note: to have the 'Cannot get Membership Provider' exception details displayed, you already need to make a change to the default settings within the SecurityToken web.config, namely set the attribute includeExceptionDetailInFaults of serviceDebug to true. Without this, only a general error is displayed: The server was unable to process the request due to an internal error.
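The setting meant here is the standard WCF serviceDebug behavior; as a fragment sketch of the SecurityToken web.config:

```xml
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior>
        <!-- include exception details in faults; for troubleshooting only -->
        <serviceDebug includeExceptionDetailInFaults="true" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```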

2. Missing MembershipProvider in Central Admin

Well, you cannot actually forget to fill them in if you selected ‘Enable Forms Based Authentication’. But you could by accident forget to select that option.
The result of this is more of a functional error: forms-based authentication is now missing as an option to log on to the web application.

3. Missing or different named MembershipProvider in web application’s web.config

In this case you can still log on to the web application, either via Windows Authentication or via FBA. As noted above, membership is handled by the SecurityToken service application, not by the web application itself. This can be a bit confusing at first. The result of the configuration error is noticed within the application itself, when you try to search credentials from the FBA membership provider via the PeoplePicker. Strangely enough, the PeoplePicker is aware of the configured FBA membership, but apparently cannot include it in its search space.

4. Mismatch between the PeoplePicker setting (incorrect) and the named MembershipProvider in the web application’s web.config

In this case, strangely enough, the PeoplePicker is able to use the Membership provider, that is, if it is consistently named with respect to the SecurityToken web.config (otherwise issue 1 applies).
At first I could not detect any PeoplePicker malfunction as a result of this configuration mismatch. I could still find all credentials in the membership provider, also via wildcarding.
Only when I sabotaged the wildcarding on purpose could I see the effect.
So it appears that for wildcard search the ‘%’ is already set somewhere as default for all Membership providers, and that you only have to override this in case your Membership provider uses a different wildcard pattern.
Personally, I find it better to always be explicit, and thus also explicitly specify ‘%’ as wildcard character for each Membership provider that is valid for your application. But it is optional, and others might differ with me on this…

Friday, November 4, 2011

Programmatically open SPSite using Windows Credentials

In a Proof of Concept I employ a SPListMembershipProvider for forms-based access into (sub)sites. For the PoC, a SPList in the rootweb is utilized as user administration. In the SharePoint 2010 architecture, Claims-Based authentication is handled by the SecurityToken service application. In the local development image, the STS service application may run under the same application pool account as the SharePoint web application. But this is not recommended, and thus not to be expected within a real farm setup. As a result, it is not possible to directly access the list in the external SPSite from within the runtime STS context.
SPSecurity.RunWithElevatedPrivileges cannot help here; that can only be used within the same application pool context. Instead, the proper way is to open the SPSite in the external SPWebApplication via the credentials of a SPUser in that site, e.g. that of a service account. The problem is that the SharePoint API does not directly provide a way to open a SPSite with Windows credentials. You can open a site under the credentials of a SharePoint user, but you need the SPUserToken of that user for this. And guess what: you can only determine that token when within the context of the site. Talk about a chicken-and-egg situation.
However, I came up with a manner to get out of this loop. It consists of a 2-step approach: first programmatically impersonate under the credentials of the service account, open the site, determine the SPUserToken of the site’s SystemAccount, and undo the impersonation; second, apply the SPUserToken to (re)open the site under the authorization of the site’s SystemAccount. Since Windows impersonation is a resource-intensive operation, cache the SPUserToken in memory so that the impersonation is only required initially within the process lifetime.

internal static SPUserToken SystemTokenOfSite(Guid siteId)
{
    string account, pw, domain;
    RetrieveCredentialsFromSecureStore(<AppId>, out domain, out account, out pw);

    ImpersonationHelper impersonator = new ImpersonationHelper(account, domain, pw);
    try
    {
        impersonator.Impersonate();

        // Open the site under the impersonated service account, and capture
        // the UserToken of the site's SystemAccount for later (re)use.
        using (SPSite initialAccessIntoSite = new SPSite(siteId))
        {
            return initialAccessIntoSite.SystemAccount.UserToken;
        }
    }
    finally
    {
        impersonator.Undo();
    }
}

...
if (sysToken == null)
{
    sysToken = SecurityUtils.SystemTokenOfSite(memberSitesToId[websiteIdent]);
}

SPSite site = new SPSite(memberSitesToId[websiteIdent], sysToken);

Monday, October 24, 2011

Data architecture considerations and guidelines for retrieving SAP data via Duet Enterprise into SharePoint

In Duet Enterprise FP1 it is relatively easy to generate a Gateway Model, as long as your SAP data sources live up to the required constraints. The challenge and intellectual work transfer to the data architecture: deciding on and defining the proper data service interfaces given the application context.
This blog was earlier published on SAP Community Network Blogs

Real-time retrieval versus Search indexing

Bringing SAP data into SharePoint-based front-ends can take different forms. You can unlock the SAP data in real-time via an External List, to display and even edit the SAP data via the familiar SharePoint List UI metaphor. If the row-based List format doesn’t fit, you can develop a custom SharePoint web part. You can also access the SAP data via the extensive SharePoint Search architecture and capabilities. With the improved Enterprise Search in SharePoint 2010 [or FAST, if also purchased by the customer organization], more and more new applications will be architected as search-driven. In the context of SAP data, think of searching from SharePoint context for a certain Supplier administrated in PLM, a Customer in CRM, et cetera.
Duet Enterprise can facilitate all of the above scenarios, in combination with the strengths of the SAP and SharePoint landscapes. However, the different nature of these scenarios may well result in different approaches for the data integration architecture. In the case of real-time retrieval of SAP data to render in SharePoint, it is advisable to limit the amount of data retrieved per SharePoint - Gateway - SAP backend interoperability flow. Don’t put unnecessary SAP data on the line that is not being used in the front-end: only include the fields of the SAP data that will be displayed in the UI and are relevant in this application context for the end-user. Search, however, has a different context. It must be feasible to search on any of the field data of the SAP entity, which requires that all that SAP field data is indexed at SharePoint Search crawling time. So here it is advisable not to limit the amount of SAP data retrieved, but instead retrieve as much of the SAP data as expresses some functional value.
What if the same SAP data is to be unlocked both via a SharePoint External List and via SharePoint Enterprise Search? The data integration architectures of these are thus contradictory: reduce the retrieved SAP data versus give me all. Well, nothing prohibits you from constructing multiple data integration pipelines, tuned for the different scenarios. For Search, its Gateway Model returns in the Query method all fields that contain functional value. And for the real-time External List, that Gateway Model returns in the Query method only those fields that will be rendered as list columns.
In Duet Enterprise 1.0, constructing the Gateway Model takes considerable time. This is an obstacle for constructing multiple Gateway Models with different data representation / signature, as it can easily double your Gateway mapping effort and time. Luckily in Duet Enterprise FP1 it is relatively easy to generate a Gateway Model, as long as your SAP landscape data sources live up to the required constraints.

Complex SAP data source structure

If the SAP data is conceptually a flat structure, it is sufficient to have a single Gateway Model to retrieve the data into SharePoint context. However, we all know that SAP (or, business) data is typically of a more complex structure: hierarchical with multiple child entities. When you want to retrieve such a structure into SharePoint, you basically have 2 options.
The first is to flatten the structure. This approach suffers from some disadvantages. In case of multiple occurrences of the same child type (e.g. Ordered Item), how many of them should you include in the flattened representation? All, or an arbitrary limited number? Also, for the end-user a flattened structure can be conceptually wrong and counter-intuitive: Ordered Item information is at a different functional level than the Purchase Order information.
The second option is to maintain the hierarchical SAP structure within the Gateway Model architecture. For this you need to generate a Gateway Model for the parent SAP data entity, and one for each type of its child data entities. At the SharePoint client-side you associate the resulting External ContentTypes, to establish the parent-child relation. SharePoint BCS respects the association between the External ContentTypes when it operates on the data. In the case of SharePoint Search, the child associations are crawled in the context of the parent data, and either the parent or the child SAP data entity will appear in search results. The BCS Profile page of a parent data entity also renders the data of its child entities.
In case of real-time retrieval, you can apply the BCS Business Data web parts: add to the SharePoint page both a Business Data List Web Part for the parent entity and a Business Data Related List Web Part for the child data entities. The latter will display the child SAP data entities of the parent SAP data entity that is selected in the Business Data List web part.
Mind you, the user experience of this setup is not always intuitive. In such a case, it can prove better to build your own custom presentation. An example of this is outlined in Working with complex SAP business entities in Duet Enterprise.

Sunday, October 16, 2011

Unlocking SAP data via Duet Enterprise Feature Pack 1 in more agile approach

This blog was earlier published on SAP Community Network Blogs
A major bottleneck in applying Duet Enterprise 1.0 is the time it takes the SAP + Microsoft development team to unlock SAP data via Gateway to SharePoint. You have to define the service interface in ES Builder, create a proxy in transaction SPROXY, and next via transaction SE38 realize the GenIL model. The latter requires a lot of handcrafted ABAP code to do the mapping from Gateway runtime context and data representation to the SAP backend, and vice versa. Code that follows a pattern, and thus a good candidate for code generation. This was a major pain point we experienced when initially applying Duet Enterprise 1.0 within the Ramp-Up in 2010, and reported back with emphasis to the Duet Enterprise product team.
In the coming Feature Pack 1 version, the Duet Enterprise product team has evidently got the message. FP1 comes with multiple generator tools to ease and speed up the realization of the internal Gateway Model (the new name for GenIL model) for your application scenarios. Handcrafting is largely eliminated and replaced by fully automatic generation of first the mapping code to unlock the SAP data via NetWeaver Gateway 2.0, next the required SAP Gateway service proxy, plus the SharePoint BDC model. The latter can be handed over to the SharePoint side to import the External ContentType definition into Business Connectivity Services. Something that took us with version 1.0 several days to set up at both the SAP and SharePoint side is now done in a matter of minutes. A much-appreciated side effect is that it enables agile development: if the initial Model does not fit the requested application context, just change the mapping in the tooling and regenerate. With the handcrafting approach you would lose a lot of elapsed time here, rearranging and testing your own mapping code.
Of course not everything now suddenly comes for free. Before you start the Gateway Model generation, you still first must think about the data integration architecture to achieve your application scenario. An action that involves, and requires consensus of, both the SharePoint and SAP backend architects plus developers. The usage of the FP1 / Gateway 2.0 generation tools itself puts some constraints on the SAP backend data sources: BOR, RFC or Dynpro Screens. Basically it comes down to this: the backend data entities must provide at minimum both a ‘Query’ and a ‘ReadItem’ operation, and also ‘Create/Update/Delete’ operations for update scenarios via SharePoint.
What if the available SAP backend entities do not themselves satisfy the requested integration pattern and/or the generation constraints? Even then, it is far easier and better manageable to realize a custom wrapper RFC at the ERP level that does satisfy the pattern + constraints, than to apply the manual Gateway Model code crafting. Spoken from own experience…

Friday, October 7, 2011

Read content of uploaded file within ItemAdding method

In an application it is required to validate the file content before allowing the upload into a document library. SharePoint 2010 enables this via the synchronous SPItemEventReceiver method ItemAdding. The problem however is that you cannot access the uploaded file through the SPItemEventProperties.ListItem object: at the time ItemAdding runs, the item is not yet created in, and thus not available via, the list. Via the blogpost Getting file content from the ItemAdding method of SPItemEventReceiver I found a code snippet for how to access the file. The last remaining issue was that I received an empty result upon reading from the file stream. Debugging, I discovered that the Stream position was set to the end of the file. Which is logical, since the file has been read in order to save it to the content database. By resetting the position I can read the file contents, validate them, and cancel the upload in case of invalid content:
public override void ItemAdding(SPItemEventProperties properties)
{
  string searchForFileName = Path.GetFileName(properties.BeforeUrl);
  // _context is the HttpContext captured in the receiver's constructor
  HttpFileCollection collection = _context.Request.Files;
  for (int i = 0; i < collection.Count; i++)
  {
    HttpPostedFile postedFile = collection[i];
    if (searchForFileName.Equals(
      Path.GetFileName(postedFile.FileName), StringComparison.OrdinalIgnoreCase))
    {
      Stream fileStream = postedFile.InputStream;
      // Reset the position; the stream was already read to save the file to
      // the content database, leaving the position at end-of-file.
      fileStream.Position = 0;
      byte[] fileContents = new byte[postedFile.ContentLength];
      fileStream.Read(fileContents, 0, postedFile.ContentLength);
      ...
    }
  }
}