Wednesday, January 19, 2011

Development Approach with Duet Enterprise

This post was, with some minor changes, published earlier on SAP Community Network Blogs

TopForce participated in the 2nd half of 2010 in the Duet Enterprise Rapid Deployment Program. Together with our RDP partner, we followed a well-chosen approach to evaluate this SAP – SharePoint integration product on its capabilities and potential.
In the Ramp-Up / RDP I was involved in mapping the Duet Enterprise prerequisites onto the company IT infrastructure, selecting the PoC scenario, architecting the integration approach, and configuring and realizing the Duet Enterprise services for the PoC scenario. In this article I highlight the main points of our journey.

Product evaluation via a custom Proof-of-Concept

Both partners in the RDP regarded it as essential to evaluate Duet Enterprise as an interoperability product via a custom Proof-of-Concept. Only then are we confronted with actual experiences and issues in applying Duet Enterprise, and with the potential of its capabilities. We did not settle for merely a rather simple out-of-the-box scenario provided by Duet Enterprise.

Selection criteria for the PoC scenario

For us the main reason to consider Duet Enterprise is as an integration foundation for company-specific SAP / Microsoft interoperability. To best evaluate the product capabilities on this aspect, we defined selection criteria for the PoC scenario. We weighed the possible scenarios on the following aspects:
  1. A real application context, with current and/or future purpose
  2. Give input to SAP / Microsoft integration design guidelines

Validation approach

The PoC is intentionally approached as a regular development project: functional specification, architectural and technical design, realization and testing, ultimately ending in product implementation. In these Agile times the phases are not necessarily executed in linear order; especially the integration architecture and realization went through some iterations as a result of lessons learned.

Functional specification

In this phase the application functionality is clearly defined, and agreed upon plus functionally validated with the customer and end-users. Noticeable is that in this phase special attention is / should be given to the UX factor: the User Experience of the application. This is because the most-heard complaint against SAP is typically its lesser user friendliness. And also because nowadays information workers expect no less than a pleasant and familiar workspace, in which one is facilitated and helped to do one's daily and also occasional work activities.

Integration and Software Architecture Design

Essential in this phase is close cooperation between the SAP and Microsoft solution architects. Both typically come from different backgrounds (‘different worlds’, almost like Mars versus Venus), yet it is required that they have a sufficient common understanding to architect a proper and future-proof design.

In the architecture itself, a best practice is to apply a layered service integration architecture. Each layer has its own responsibilities. In a nutshell, SharePoint sits at the front-end / presentation layer, SAP at the back-end layer; and Duet Enterprise is the glue as integration layer.

The integration layer is derived from the [use cases of the] functional specification. One of the design principles applied is that the SAP backend is and remains ultimately responsible for the correctness of the data. This also implies that business rules are enforced in, and remain the responsibility of, the SAP backend.

Duet Enterprise also imposes some constraints on the service interfaces. At the conceptual level this means that Duet Enterprise [currently] only supports data-oriented interfaces, operating via CRUD behavior. The explanation is that Duet Enterprise utilizes Business Connectivity Services to exchange data between SAP and SharePoint. BCS is a strictly data-oriented capability; it is not usable for [SAP] process-oriented control.

Development Process

Once the service interfaces are derived and defined, the development team can go ahead with implementing the SAP-SharePoint integration via Duet Enterprise. The outline of this development process is as follows:
  1. Configure / define the derived interfaces as SAP Enterprise Services in the ESR, using the SAP ES Builder. Here you must augment the interfaces with 2 additional parameters required for proper Duet Enterprise handling, namely:
    • Correlation ID; this is used for the end-to-end monitoring of the entire runtime flow through the SAP and Microsoft landscapes
    • Business Object Instance Key; this is used as generic identifier in all SCL framework components, at runtime and at design time. It consists of 3 parts: the Data Value Identifier, the SCL Business Object Name and the System Alias of the Backend. The key is generated in the SCL processing of each SCL service response, in the following structure: <data_value_identifier>_<Business Object Name>_<System Alias Of Backend>. The usage of this key is to enable the SCL in subsequent handling to transparently identify the correct backend in a SAP landscape with possibly multiple system instances. Notice the implication that if your SAP landscape consists of only a single backend, there is no concrete runtime usage of this field; all your SCL requests will be routed to the single backend. In that case you can simplify the key structure.
  2. Export the defined SAP Enterprise Services in WSDL format
  3. Hand over the WSDL file to both your SAP developer(s) and your SharePoint developer(s). From here, both can configure and/or build their own side in parallel.
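The key structure from the Business Object Instance Key bullet above can be sketched as follows. The helper function is purely illustrative: in the real product the key is composed within the SCL's ABAP processing, not via any Duet Enterprise API.

```python
def build_instance_key(data_value_identifier, business_object_name, system_alias):
    """Compose the Business Object Instance Key in the documented structure:
    <data_value_identifier>_<Business Object Name>_<System Alias Of Backend>."""
    return "_".join([data_value_identifier, business_object_name, system_alias])

# Hypothetical example: instance 4711 of SCL Business Object 'Order'
# in the backend with system alias 'ECC_PROD'.
key = build_instance_key("4711", "Order", "ECC_PROD")  # "4711_Order_ECC_PROD"

# In a single-backend landscape the alias part carries no routing value,
# and the key structure can be simplified accordingly.
```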
SAP NetWeaver 7.02 / Service Consumption Layer
  1. In the SCL system, using ABAP Workbench / transaction SPROXY, create the service proxy. This generates the skeleton of the provider class that includes the methods specified in the service interface. Later on you implement the operations, which involves some ABAP programming. The reason for this break-up is that constructing the operations requires the GenIL model, which you have not yet generated at this stage.
  2. In the SCL system, in transaction se38 create the GenIL model based on the request and response structure of the service proxy. This generates a generic SCL Business Object Model. It is named ‘generic’ because it is an intermediate mapping layer between the consumer-specific (SharePoint BCS) data structures and the backend-specific (e.g. SAP ECC 6.0) part. This enables the consumer to communicate transparently with different backend systems/versions. That can be simultaneous, in case of a distributed SAP landscape consisting of a multitude of variant backends (e.g. ECC 6.0 for North America, while an older 4.6 release for the Dutch company division). But it is also beneficial when you upgrade your single-system SAP environment – the GenIL model shields the client side from such a backend environment change. And this also holds in the other direction – via the same GenIL model, clients other than SharePoint can in theory also consume the exposed SAP data and functionality. An example is Alloy for IBM WebSphere.
    The GenIL model is actually more a generic Gateway derivative than strictly Duet Enterprise specific.
  3. In the SCL, via transaction SIMGH create per service operation a Backend Operation Proxy that invokes the SAP backend capability: either a SAP Enterprise Service, a BAPI WebService, or an RFC / Function Module.
  4. In the SCL, via transaction se80 create per service operation a Mapper object that implements 2 methods with predefined signatures for inbound and outbound mapping. This involves ABAP code to do the mapping from the GenIL model to the backend-specific data format, and vice versa.
  5. In the SCL / ABAP Workbench, return to the generated Service Proxy and fill in each of its generated operations. This also involves ABAP code to do the mapping from the received input to the GenIL model, and from the GenIL model to Duet Enterprise service output.
  6. In the SCL via transaction SOAMANAGER, create the endpoint for the service proxy.
SharePoint 2010
  1. Start SharePoint Designer to generate External ContentType(s) based on the WSDL imported from the SAP Enterprise Services.
  2. If applicable, generate a SharePoint External List as User Interface metaphor. Whether it is applicable is determined by a) the data structure: an External List requires a flat structure (thus no hierarchical / parent-child relationships); and b) the User Experience desired by the end-user.
  3. If the External List is not applicable, realize in Visual Studio a custom-built SharePoint webpart. Here you can use the whole richness of available ASP.NET and SharePoint webcontrols – datagrids, calendar, richtextbox, dropdownlist, … Also, SharePoint 2010 allows InfoPath forms for custom UI.
  4. Configure the BCS model to connect this External ContentType to the endpoint in the SCL of the Duet Enterprise specific service.

Conclusion

Introduction and application of Duet Enterprise is not a free ride. It takes time to get familiar with its concepts, and with the development approach plus tooling add-ons in the ABAP Workbench and SharePoint, in the form of the BCS Model Generator. It also requires preparation time up front in defining and architecting an application design. But that is no different from any other software development project. Given that, Duet Enterprise is a noteworthy addition to the SAP / Microsoft interoperability toolbox; and I recommend considering it for applicability in your specific SAP/MS integration scenario.

Friday, January 7, 2011

CQWP, CSS and jQuery applied to rearrange groups of overview links

A customer requirement for their SharePoint intranet is a reusable 'Overview-of-Handy-Links' capability, with the following specification:
  1. Contributors must have a means to manage the overview links. The functional management must be easy, without the need to maintain HTML.
  2. The links must be grouped
  3. The display-order of the overview groups must be controllable
  4. The display-order of the links in a group must be controllable
  5. The display space must be optimally used; white-spacing must be minimized
  6. The overview background color, link color and link hover color must be settable
  7. The number of columns must be settable
  8. Minimize the amount of custom code, favour the usage of standard SharePoint capabilities
As a consequence of the first requirement, application of the ContentEditorWebPart is not an option. That would mean that the contributors are directly confronted with the HTML specification of the overview. Utilization of the ContentQueryWebPart appears more promising. The rough solution design is then to 1) maintain the group headers in one SharePoint SPList; 2) maintain the overview links in another SharePoint SPList, with a Lookup to the GroupHeaders SPList; 3) connect a ContentQueryWebPart to the OverviewLinks SPList; 4) group on the GroupHeader Lookup field/column; 5) codify in the CQWP Main, Group and Item xslt-files the templates to deliver the appropriate HTML.
The requirement to optimally use the display space can to a large extent be achieved via CSS float functionality: automatically 'glue' the individual Group DIVs next to each other. In the horizontal direction there is then no lost white-spacing. In the vertical direction, however, that cannot be achieved with CSS float alone: float flows in the horizontal direction, not vertically.
But there is jQuery to the rescue. The basic idea is to first let the CSS floating capability align the set of DIVs in as optimal usage of the display space as possible; and afterwards perform some minor rearrangement to also reduce the lost white-spacing in the vertical direction.
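The vertical rearrangement boils down to masonry-style placement: repeatedly assign the next group to the currently shortest column. Stripped of the jQuery DOM wiring (the actual page moves the Group DIVs with jQuery, e.g. via .appendTo), the core placement logic can be sketched as a plain function; the function name and the fixed column count are illustrative.

```javascript
// Masonry-style placement sketch: distribute group heights over a fixed
// number of columns, always filling the currently shortest column.
function placeGroups(groupHeights, columnCount) {
  var columns = [];
  for (var i = 0; i < columnCount; i++) {
    columns.push({ height: 0, groups: [] });
  }
  groupHeights.forEach(function (h, groupIndex) {
    // Find the column with the least accumulated height so far.
    var shortest = columns.reduce(function (a, b) {
      return b.height < a.height ? b : a;
    });
    shortest.groups.push(groupIndex);
    shortest.height += h;
  });
  return columns.map(function (c) { return c.groups; });
}

// One tall group (0) ends up alone; the three short ones share a column.
var layout = placeGroups([120, 40, 60, 30], 2);  // [[0], [1, 2, 3]]
```

In the page itself, the resulting per-column group indexes would then drive the jQuery moves of the corresponding Group DIVs.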
[Screenshots: the overview via CQWP alone, and after the jQuery rearrangement]

Reusability of the OverviewLinks functionality is achieved by applying the Feature capability. The Feature provisions the required SharePoint artifacts: the SharePoint SPLists - including the Lookup relation between them -, a preconfigured CQWP, and the XSLT files.

Tuesday, January 4, 2011

Solution partitioning and dependencies with mixed scopes causes farm-deployment trouble

We probably all agree that it is wise to partition your SharePoint application deliverables and deployment into multiple packages. A typical example is a Base assembly, deployed via a base wsp, providing a general foundation layer with diverse technical and functional building blocks. And then one or more functional packages that provision the SharePoint artifacts and code for a specific application functionality, in which the functional building blocks may use and rely on parts provided via the Base assembly.
However, sometimes this architecturally sane decision to partition your application deployment can result in unforeseen effects. Let me demonstrate via a real-life example that I encountered today:
  • Application deployment deliverable
    1. includes a Base assembly, deployed via solution package Base.wsp. This wsp package contains the Base code, some basic Features, JobDefinitions, and generic webparts. In the Base code / assembly there are FeatureReceivers and WebParts. As a result, SharePoint solution generation (WSPBuilder or VS2010) puts SafeControl settings per contained WebPart in the solution manifest; which in turn makes the scope of the solution package WebApplication.
    2. and also contains a pure functional package, with some Features and SiteTemplates. This functional package does not contain an assembly itself, but instead relies on the assembly of Base.wsp for reused FeatureReceivers. As a result this solution package does not contain any artifacts which are scoped at WebApplication level, and the SharePoint solution framework requires it to be deployed globally in the farm. (NB: the situation described below can even occur when the package does contain managed code, as long as the assembly does not contain SharePoint coded artifacts which transition the scope of the solution package to WebApplication; e.g. no WebParts or Resources.)
  • The order of solution package deployment is to first install Base.wsp, and then Functional.wsp. You would then expect that the deployment dependency of Functional.wsp on Base.wsp is ascertained. However, in this scenario I discovered in the SharePoint solution store an error with the deployment of Functional.wsp. Error details:
    <server 1> : The solution was successfully deployed.
    <server 2> : The solution was successfully deployed.
    <server 3> : Feature '<GUID>' could not be installed because the loading of event receiver assembly "Base, Version=1.0.0.0, Culture=neutral, PublicKeyToken=13a12821e9f69237" failed: System.IO.FileNotFoundException: Could not load file or assembly 'Base, Version=1.0.0.0, Culture=neutral, PublicKeyToken=13a12821e9f69237' or one of its dependencies. The system cannot find the file specified.
At first I thought this to be a timing issue. The Solution Framework uses timer jobs to deploy solutions throughout the farm, and it looked as if the deployment of Functional.wsp was already attempted on <server 3> while the deployment job for Base.wsp had not yet executed for <server 3>. However, re-deployment of Functional.wsp - while making sure beforehand that the deployment of Base.wsp was now fully completed in the farm - resulted in the same error on <server 3>. So it must instead be directly related to the installation on <server 3>: apparently the Base.dll was not installed in the GAC on that server, while the Features within Functional.wsp were to be installed on <server 3>. Why is this?
Well, the explanation is twofold. It relates to the SharePoint farm infrastructure, as well as to the scopes of the SharePoint packages. <server 1> and <server 2> have the SharePoint web application role in our load-balanced farm, and each host the SharePoint web applications. <server 3> however is dedicated as Search Index server, and does not host any SharePoint web applications. The SharePoint solution framework deploys WebApplication scoped packages only to the servers on which SharePoint web applications are hosted (Base.wsp: <server 1> + <server 2>). But it deploys Global scoped packages to ALL the servers in the farm (Functional.wsp: <server 1>, <server 2> AND <server 3>).
Given this insight, a pragmatic SharePoint solution deployment approach is to deploy the Global scoped package with the Force flag, thus suppressing the error signal. This worked sufficiently in our case. However, it leaves you with the somewhat unwanted situation that via the Force flag you suppress all error notifications, which may be too blunt. A better approach is to prevent the Solution Framework from deploying your application package to the servers on which the web application is not hosted. This can be achieved by making sure that each of your application packages has WebApplication scope: either because it inherently contains such entities (e.g. WebParts), or artificially, by including a dummy artifact directed at WebApplication level (e.g. an empty virtual bin file).
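For reference, the Force-flag variant discussed above can be sketched in SharePoint 2010 deployment commands. The solution name is the hypothetical one from this example; the -force / -Force switch is what suppresses the error reported from the index server.

```shell
# Deploy the globally scoped package with the Force flag, suppressing the
# FileNotFoundException signalled from the server without the Base assembly.
stsadm -o deploysolution -name Functional.wsp -immediate -allowgacdeployment -force

# PowerShell equivalent (SharePoint 2010 Management Shell):
Install-SPSolution -Identity Functional.wsp -GACDeployment -Force
```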

Thursday, December 30, 2010

Emphasis on Office 2010 hindering Duet Enterprise implementations?

In a blog of Venture Research I read the following rumor: "…about the partnership with Microsoft in Duet Enterprise, which my sources say is not advancing as fast as desired because many organizations are not anxious to upgrade to the latest release of Microsoft Office, which is necessary to derive the true value of Duet. This caution by organizations to update their underlying platform has good reasons in terms of cost and resource constraints, and SAP is not able to do anything about that."
If true, I find this a misconception and incorrect positioning of the Duet Enterprise potential. Surely, it also provides a connection to Microsoft Office 2010 clients. But the true strength and potential of Duet Enterprise is that of a standards-based SAP / Microsoft interoperability foundation. It comes out-of-the-box with multiple integration plumbing capabilities which you otherwise need(ed) to implement yourself. And being a commercial product, Duet Enterprise is backed by both SAP AG and Microsoft Corp as strong and future-proof IT suppliers.
That said, the implementation of Duet Enterprise puts its demands on the server software infrastructure in your company: SharePoint 2010 Enterprise Edition on the Microsoft stack, and NetWeaver 7.02 on the SAP stack. These are minimal necessities; without both of them available, Duet Enterprise is not an option. But Microsoft Office 2010 at the client side is merely a bonus, not a prerequisite.

Thursday, December 23, 2010

SAP Connector for Microsoft .NET 3.0 released

Yesterday SAP made the renewed version 3.0 of the SAP Connector for Microsoft .NET (NCo) officially available. This version is the long awaited successor of NCo 2.0 - which still had a design-time and runtime dependency on .NET Framework 1.x and Visual Studio 2003.
With the NCo it is possible to directly interoperate from a .NET consumer context with SAP RFCs. It is thus a different approach than Duet Enterprise, in which the SAP/SharePoint interoperability is achieved via standards-based webservices. The Duet Enterprise services are provided by the SCL, which transparently encapsulates and shields the specifics of the SAP backend towards the SharePoint consuming side.
In the earlier versions the NCo used the SAP librfc32.dll to achieve the SAP/.NET binary interoperability. In NCo 3.0 the RFC protocol is fully re-implemented in the NCo itself. The main advantage is that there is no longer a dependency on librfc32.dll being available on each .NET consuming system. This should also result in better performance, as no marshalling is required anymore in the .NET client runtime context from the managed .NET code to the unmanaged librfc32 invocation.
In the previous NCo versions the programming model was to generate, at design time in Visual Studio 2003, a .NET-based proxy per SAP RFC that you intend to call. This involved a proxy method for the ABAP Function Module, and an individual proxy class for each structure or data table used in the RFC signature. In case of a change in the SAP backend, you were required to regenerate the proxies, and rebuild plus deploy a new assembly for the re-generated code. In NCo 3.0, the RFC call pipeline is dynamically constructed. Whenever there is a change in the enclosed SAP back-end, the NCo adapts the internal RFC invocation on the fly. NCo 3.0 thus transparently shields the .NET consumer from modifications and upgrades in the SAP backend. However, this approach also has its downside: you are in effect developing against a weakly-typed integration API, much resembling the .NET reflection pattern. There is no compile-time validation nor protection against incorrect RFC invocations.
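To illustrate the weakly-typed, reflection-like programming model: a minimal NCo 3.0 sketch, in which the destination name "PRD" and its connection configuration are assumptions (registered elsewhere via an IDestinationConfiguration), and STFC_CONNECTION is SAP's standard connection-test function module.

```csharp
// Dynamic RFC invocation with NCo 3.0 (SAP.Middleware.Connector).
// No design-time generated proxy: metadata is fetched from the backend at runtime.
using SAP.Middleware.Connector;

class NcoSketch
{
    static void Main()
    {
        // "PRD" is an assumed, pre-registered destination configuration.
        RfcDestination dest = RfcDestinationManager.GetDestination("PRD");

        IRfcFunction fn = dest.Repository.CreateFunction("STFC_CONNECTION");
        fn.SetValue("REQUTEXT", "Hello from .NET");  // parameter name is a plain string
        fn.Invoke(dest);
        string echo = fn.GetString("ECHOTEXT");
    }
}
```

Note how a typo in a parameter name like "REQUTEXT" would only surface at runtime; that is the trade-off for the on-the-fly adaptation to backend changes.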
The NCo 3.0 can be downloaded from SAP Service Marketplace, including accompanying documentation.

Tuesday, November 30, 2010

Participating in Duet Enterprise Ramp-Up / Rapid Deployment Program

This blog is earlier published on SAP Community Network Blogs
In May this year, TopForce applied as IT partner, together with an end-user organization, for participation in the Duet Enterprise Ramp-Up. The rationale for both organizations to participate in this Rapid Deployment Program is to gain early insights and practical experience on the potential and added value of Duet Enterprise.
Our mutual application was accepted by the RDP Program in June. As one of the first activities, both organizations each sent 2 to 3 employees to Duet Enterprise training. Actually this was a prerequisite for RDP participation. The 5-day training addressed both the infra/operations aspects of Duet Enterprise in the SAP and Microsoft landscapes [3 days], and the development approach in the SAP and Microsoft development suites [2 days].
After this first introduction to the Duet Enterprise product, we received access to the software bits and accompanying documentation. The latter mostly in draft versions; understandable, as the product is still in Ramp-Up.

Reasons for participating

The mission of TopForce is to deliver the concept of the High Performance Workplace for our customers. In essence this means that we aim for ease of use in the daily workspace of the employees, thereby abstracting away the intricate details of the total backbone IT landscape. As our background is in SAP consultancy, at a lot of our customers SAP is [part of] the business back-end. In our HPW vision, one of the means to achieve a pleasant employee workplace is operating the SAP back-end via a SharePoint based front-end (Unlock the value of SAP business processes within a SharePoint based HPW). We defined a conceptual integration architecture for this SAP-backend / MS-frontend interoperability. The architecture was validated via multiple interoperability technologies and products; e.g. WCF to BAPI WebServices, WCF LOB Adapter, Sitrion. The drawback in all of them was that we still had to develop a lot of the interoperability plumbing ourselves, most noticeably the support for Single Sign-On. Duet Enterprise now promises to be a good (best?) additional alternative for enabling SAP / MS interoperability.
Our partner shares this same goal, but also has a clear additional target. In recent history several projects have emerged in which SAP / Microsoft integration might be subject to business and IT architectural decision making, and/or in which SAP / Microsoft integration is concretely realized in some aspects. As the role of both the SAP and Microsoft server landscapes is gaining importance at this end-user organization, they have the need for consistent design guidance on when and how to achieve SAP / Microsoft interoperability. Duet Enterprise is positioned in this context as an important integration technology to consider.

Project team

Our RDP project team consists of the following roles / persons:
From the end-user organization
  1. Project lead
  2. SAP solution architect
  3. Microsoft solution architect
  4. SAP business analyst
  5. SAP infra/operations
  6. SharePoint infra/operations
  7. SharePoint developer
  8. SAP NetWeaver solution architect / consultant
From TopForce as IT partner
  1. SAP / SharePoint interoperability consultant (my participation in this Ramp-Up)
  2. ABAP developer [only temporarily required]
From the combined SAP / Microsoft RDP support team
  1. SAP NetWeaver consultant [from SAP AG]
  2. SharePoint consultant [from Microsoft Services]
  3. Whenever applicable, supplemented with SAP and Microsoft consultants with varying expertise, depending on the nature of the discussions and encountered issues

Validation approach and activities

The potential of Duet Enterprise is twofold. On one side it delivers directly usable out-of-the-box functionalities, customizable in some degree to your situation. Second to that, and this is where it strongly differentiates from the original Duet proposition, it provides a SAP / SharePoint interoperability foundation. Although the first is certainly interesting, our main combined focus for now is evaluating Duet Enterprise for its added value on the interoperability plumbing aspects. This validation is done by means of a real-life custom-developed application. The following activities are executed by the RDP team:
  • Attend the Duet Enterprise training to get acquainted with the infra and development aspects
  • Install and configure Duet Enterprise in the SAP NetWeaver 7.02 and SharePoint 2010 landscape
  • Derive and define the Software Architecture Design for the selected real-life application
  • Discuss the Software Architecture Design with RDP consultants, from SAP AG and Microsoft Corp.
  • Model, compose and [only] where required custom-develop the interoperability + integration between SAP backend environment, and SharePoint 2010 based front-end
  • Derive and define the Design Guide for SAP / Microsoft integration decision making at business and IT architecture level

Where do we stand / Results so far

In this near-end phase of the Ramp-Up program, approaching the General Availability of Duet Enterprise, it is not viable to go into product details. This will be addressed later, after the successful conclusion of our Ramp-Up participation.
Results and findings that I can share now are:
  • Getting acquainted with the design and development approach for a Duet Enterprise application has an initially steep learning curve. Mind you, this will lessen after General Availability, when the accompanying documentation will be more complete and mature.
  • Installation and configuration of Duet Enterprise takes its time. This is mainly due to the complexity of any typical SAP landscape anno 2010 (ECC, NetWeaver, Solution Manager, SLD, ESS and MSS, …), and not so much attributable to Duet Enterprise specifically. On the SharePoint 2010 side the work is much less, although here too it depends on the characteristics (complexities) of the SharePoint farm.
  • This RDP project is evidence of the added value of a mixed SAP / Microsoft development team (Build Your Interoperability Team as 1 of the Steps HowTo begin with SAP / MS interoperability)
  • It pays off if the SAP and Microsoft solution architects are at a general level familiar with the ‘opposite’ platform stack. E.g. what is the positioning of SAP PI versus Microsoft BizTalk; what is the concept and support of SAP Enterprise Services.
  • It pays off to have at least one team member aboard who has knowledge and practical experience in the SAP / MS interoperability area: knowledge of both technology stacks, interoperability technologies and approaches.
  • And a final remark: SAP / MS interoperability is and remains interesting stuff, from both business and IT architectural points of view…
Duet Enterprise is approaching its General Availability date. If you are considering its application for SAP / Microsoft interoperability, I would like to share ideas and experiences.

Thursday, November 4, 2010

2 approaches to architecting SAP / SharePoint interoperability

This blog is earlier published on SAP Community Network Blogs
In today’s market the interest is growing to integrate the structural business processing of SAP within the familiar workplace environment of the information worker. In more and more companies these enterprise workplaces are provided by means of SharePoint 2007/2010. Achieving SAP / SharePoint integration in a structural and future-proof manner requires upfront investigation to come up with a solid interoperability architecture.

Main steps to define a solid SAP / Microsoft.NET interoperability architecture

  1. Derive and define guiding principles; originating first from the business perspective, and next from IT. Typically the latter are more of a constrictive nature; e.g. required to apply a service architecture, required to conform to W3*-standards, ...
  2. Analyze the current state of the IT landscapes (‘IST’) within the company: SAP and Microsoft environments and server products.
  3. Define and describe the interoperability architecture
    • Conceptual level; on purpose technology and product agnostic, to make it more timeless and future-proof
    • Concrete level; with a choice for interoperability technologies and products that are nowadays available
  4. Validate the interoperability architecture by means of either a Proof-of-Concept, or via a small launching interoperability project.
  5. Adjust the defined interoperability architecture on the lessons learned.

Guiding architecture principles

  1. Layered architecture, with separation of concerns. A typical layer architecture is:
    • Presentation
    • Integration
    • Application
    • Data
  2. Service architecture
  3. Loosely coupled
  4. SAP business backend is and remains responsible for the correctness of business process
  5. Responsibility of business data consistency within the SAP backend layer

Approaches to derive the interoperability architecture for a concrete case

When in context of a concrete application, there are basically 2 approaches you can apply to derive the layered interoperability architecture:
  1. Inside-Out
  2. Outside-In (nb: in Microsoft terminology, this approach is also referred to with the phrase ‘Contract-First’)

Inside-Out

As the name already suggests, the starting point here is the current state of your SAP landscape. Start with identifying the available functional building blocks in the SAP environment – existing BAPI Function Modules, RFCs, SAP workflow business objects, and already available SAP webservices. Then expose these to the outside world, to have the related SAP data and functionality consumed by a non-SAP front-end.
Advantages
The biggest advantage of this approach is a faster time-to-market. You can base your SAP / Microsoft .NET interoperability on already existing, and thus also tested, SAP building blocks. The only thing that needs to be done is to put a (web)service interface on them, and then you can integrate with the Microsoft based presentation layer.
Disadvantages
This can be summarized with the phrase ‘Garbage In – Garbage Out’. If you base your interoperability architecture on the current state of your SAP environment, there is a large likelihood that SAP-proprietary concepts will be visible on the integration surface level. In general, an architecture that originates via this approach will be less pure and transparent. And thus less future-proof.

Outside-In

The essence of the Outside-In approach is to first agree on and define the conceptual contract [interface] between the service provider side [SAP] and the service consumer side [SharePoint]. The idea is to start from the requested application functionalities. Derive and define at a conceptual level the services you require from the SAP backend to deliver the application and system functionalities. Describe the service interfaces in W3*-standards based notation and data structures. From there, map onto the required SAP building blocks: existing ones if available, new ones otherwise. At the SharePoint / presentation side, you can build the consumption layer for the defined service interfaces via the WSDL.
Thus start at ‘outside’ with the conceptual, externally visible service interfaces; and continue then for both provider and consumer / SAP and Microsoft sides to the inside with their respective technologies.
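A minimal contract-first sketch: the message contract is defined first, in neutral W3C-standards based XSD, before either the SAP provider or the SharePoint consumer exists. The namespace and element names here are purely illustrative.

```xml
<!-- Contract-first sketch: the data contract is agreed in plain XSD
     before any implementation; all names are illustrative. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="urn:example:ordering"
           elementFormDefault="qualified">
  <xs:element name="GetOrderRequest">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="OrderId" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
  <xs:element name="GetOrderResponse">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="OrderId" type="xs:string"/>
        <xs:element name="Status" type="xs:string"/>
        <xs:element name="TotalAmount" type="xs:decimal"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

From such a contract, the SAP side maps onto (new or existing) building blocks, while the SharePoint side generates its consumption layer from the derived WSDL.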
Advantages
Because you start this approach with a green field, the resulting interoperability architecture typically has a cleaner interface, which inherently conforms to interoperability standards. And because you start from the required business services, it also has a better chance of being conceptually correct, and future-proof.
Disadvantages
The biggest disadvantage is that this approach requires more investment and time up front in deriving and describing the service interface layer. It also requires getting representatives of both the SAP and Microsoft departments on par, to reach a common understanding of the applied service concepts. And in case of existing SAP building blocks, a transformation can be required to map the standards-based service interface onto the SAP-specific Function Modules, RFCs and workflow business objects.