Wednesday, November 28, 2012

Teamcenter SOA : Introduction

A Service Oriented Architecture (SOA) framework is offered by many enterprise product vendors because of its interoperability and reusability. And because the services are modeled on business use cases, SOA APIs are easy to use: they hide the underlying complexity from the application developer. Teamcenter also offers an SOA framework, both for customization and for integration with other applications. In this series of blogs I will cover the concepts of the Teamcenter SOA framework and how to create your own SOA services on top of it.
Teamcenter UA SOA :
Teamcenter provides both an SOA framework and a set of out-of-the-box (OOTB) SOA services for direct consumption. Teamcenter SOA can basically be used in two ways:
1)      Using the OOTB SOA services from an SOA client.
2)      Creating your own SOA services, which can then be consumed by others.
Teamcenter SOA currently supports three languages: C#, C++ and Java. Development can be done in any of these, either using the OOTB SOA services for application development or developing your own SOA services for other developers to use. The list of SOA services can be seen in BMIDE under Extensions -> Code Generation -> Services. It lists all the services available in a given Teamcenter environment. You can also see the details of the data types and operations corresponding to each SOA service in BMIDE, as shown in the image below. We will discuss data types and operations in detail in future blogs.

Teamcenter SOA Framework:
The Teamcenter SOA framework provides a set of connection protocols such as HTTP and CORBA, auto-generated stubs on the server, and a data model to support client applications. The SOA server architecture sits above the Business Object (AOM) layer. SOA server code can call ITK APIs to perform business logic, as shown in the diagram below.
Teamcenter SOA is a set of APIs, or programming interfaces, for application developers. The API libraries are shipped in the soa_client.zip file on the Teamcenter software distribution image, organized by supported programming language: Java, C++ and C#. This ZIP needs to be extracted, preferably into the TC_ROOT folder, so that application code which uses SOA services can link against it. soa_client.zip also contains sample SOA code in all supported languages. In my next blog we will see how to use the SOA APIs, establish a connection through SOA, and use the OOTB SOA services.

See Also :
Teamcenter SOA : Using OOTB SOA Services
Teamcenter SOA : Sample SOA Code Setup

Teamcenter SOA : Create your Own SOA
Teamcenter SOA : Detail Step for Creating SOA

Sunday, October 28, 2012

Teamcenter Report Framework

Reporting is one of the key modules in any PLM system, and with the growing emphasis on analytics and reporting over enterprise data, it will play an even bigger role in the future. Until now, reports in PLM have been static, so-called dumb data: they simply extract data and transform it, mainly through style sheets. With more powerful systems, reports are becoming more sophisticated and are expected to provide answers rather than just dump data in a nice format. Teamcenter provides tools for creating both static and intelligent reports. In this blog we will discuss static reports, which can be created in Teamcenter through the built-in report module. Functionality for complex, analytic reports is also available in Teamcenter through a third-party tool.

Reports in Teamcenter: Teamcenter provides the Report Builder module to create reports based on PLM data. Reports in Teamcenter are built on the Query and PLMXML frameworks. The basic concept is to fetch objects from Teamcenter through a query; the Teamcenter server converts the result into PLMXML output, on top of which an XSL style sheet can be applied for layout and report UI. The PLMXML output can be controlled in the same way as when creating a transfer mode for exporting/importing objects from Teamcenter. I already covered PLMXML in an earlier blog. Report Builder always creates static reports. Reports are categorized into three types in Report Builder:
1)      Summary Report: A general, overall report that is not in the context of a specific object. Example: a Change Object status report.
2)      Item Report: A report generated in the context of a specific object, which the user typically selects before creating the report. Example: a signoff report for a CR, where the CR must be selected.
3)      Custom Report: Teamcenter provides a custom hook for creating reports which cannot usually be built through a query or XSL alone. A custom executable can be written which creates the report when executed from the user's Teamcenter session.
Report Builder Module Components:
Report Builder has the following major components, which define the extraction and transformation rules for reports.

Query: A search query created in the Query Builder module of Teamcenter. The query defines which objects are to be extracted under various search criteria. Search criteria can be based either on a property of the target object or on a related object (e.g. Item Revision as the target and Dataset as the related object). The search criteria can either be exposed to the end user or hidden. I covered queries in detail in my Query Builder blog.
Closure Rule: This component comes from the PLMXML module. It defines the extraction rules for the objects returned by the query: which other related objects should be extracted along with the target objects. See my PLMXML blog to know more about closure rules.
Property Set: This defines additional attributes to extract beyond the default properties the PLMXML engine exports for a given object type. See my PLMXML blog to know more about property sets.
Report Format: This defines the XML format for the report extracted from Teamcenter. Teamcenter supports two formats: traditional PLMXML and TcXML, which is based on the newer schema available in Teamcenter Unified.
Report Style Sheet: This defines the layout of the report. The style sheet is an XSL file which converts the XML output from Teamcenter into different compatible formats with a proper layout. You can learn more about style sheets on the W3C site or Wikipedia. Most of the time, report customization is limited to creating XSL based on the report layout requirements.
Custom Component: This is used when you want to create custom reports through custom code. You need to provide the path of the custom executable and any parameters it requires.
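To make the interplay of these components concrete, here is a minimal Python simulation of the query -> PLMXML -> extraction chain. The object data, property names and functions are hypothetical stand-ins, not real Teamcenter APIs; the real pipeline runs inside the Teamcenter server.

```python
import xml.etree.ElementTree as ET

# Hypothetical in-memory "database" standing in for Teamcenter objects.
objects = [
    {"type": "ItemRevision", "id": "000123/A", "status": "Released"},
    {"type": "ItemRevision", "id": "000124/A", "status": "In Work"},
]

def run_query(db, criteria):
    """Stand-in for a Query Builder query: filter objects by property values."""
    return [o for o in db if all(o.get(k) == v for k, v in criteria.items())]

def to_plmxml(results, property_set):
    """Stand-in for the PLMXML export: emit only the listed properties."""
    root = ET.Element("PLMXML")
    for obj in results:
        node = ET.SubElement(root, obj["type"])
        for prop in property_set:
            node.set(prop, obj[prop])
    return ET.tostring(root, encoding="unicode")

# Query for released revisions, then "export" them with a property set.
xml_out = to_plmxml(run_query(objects, {"status": "Released"}), ["id", "status"])
print(xml_out)
```

In the real module, the last step would be an XSL transform of this XML into the report layout, rather than a plain print.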
Most generic reports can be created from Report Builder without any customization. The limitation is that such reports are always static; no intelligence (filter criteria, if-else analysis, etc.) can be added, and there is no way to fetch data from other sources and normalize it with Teamcenter data for reporting.
Due to these limitations, Teamcenter also provides the tightly integrated Teamcenter Reporting and Analytics for advanced reports that span multiple data sources and support analytics.

See Also :

Basic of PLMXML Export/Import

Teamcenter Query Builder

Saturday, September 15, 2012

PLM Migration : Part 2


This is a continuation of my previous blog on PLM Migration (Part 1), where we discussed:

·         What to Migrate
·         How to Migrate
·         Before you start
In this blog we will briefly discuss the remaining factors:
·         Design Approach
·         CAD Migration
·         CAD vs Meta Data Migration
Design Approach:
Designing a migration requires a sound architecture and a validation mechanism, as well as mechanisms for log and report creation. Most complex migrations are done through an ETL process, mainly via a staging database where transformation, normalization and cleaning are done. The following points should be considered when designing an ETL-based migration.



1)      A clean interface data load and staging table design should be defined. It is recommended to have separate tables for input data and for transformed data; this helps with traceability and data validation.
2)      All exceptions should be logged in an error table, and appropriate reports should be generated for further action on those exceptions. This saves a lot of time at the upload stage, since most issues can be caught in the transformation stage.
3)      The sequence of interdependent uploads should be well defined and documented. Since PLM is a complex web of interlinked data, the hierarchy of data dependencies, as well as the data load sequence, is important to define.
4)      If the migration spans multiple systems, a proper validation procedure should be written to check the integrity and accuracy of the data from the different systems.
A typical ETL migration involves the following high-level flow:
1)      Extraction from legacy system.
a.       Meta Data Extraction
b.      Non CAD Files
c.       CAD extraction
2)      Load Meta data to staging
3)      Input data Validation
4)      Transformation
5)      Post Transfer Validation
6)      Staging Extraction for upload
7)      Data Upload.
a.       CAD Upload
b.      Metadata and Volume files Upload
It is important to design for synchronization of the metadata, file and CAD uploads, as all three are interdependent. For example, the CAD load may require properties such as type, part number, etc. from staging.
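The staging-table flow above can be illustrated with a minimal sketch. The table and column names are hypothetical; a real implementation would use a staging database, not in-memory lists.

```python
# Hypothetical staging pipeline: extract -> validate -> transform -> load,
# with rejected rows routed to an error table for later reporting.
input_table = [
    {"part_no": "P-100", "desc": "Bracket"},
    {"part_no": "", "desc": "Missing part number"},   # invalid row
]

error_table = []      # exceptions are logged here, not silently dropped
transform_table = []  # separate table for transformed rows (traceability)

for row in input_table:
    if not row["part_no"]:                      # input data validation
        error_table.append({"row": row, "reason": "empty part_no"})
        continue
    transform_table.append({                    # transformation/normalization
        "item_id": row["part_no"].replace("P-", "ITEM"),
        "name": row["desc"].strip(),
    })

print(len(transform_table), len(error_table))   # -> 1 1
```

Keeping the rejected rows in a separate error table is what makes the exception report at the end of the transformation stage possible.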
CAD Migration:
CAD migration is one of the most difficult aspects of PLM migration. It is usually done through a third-party tool which has CAD integration capability with the given system. For example, Siemens PLM provides NX tools like ug_export/ug_import and ug_clone to export/import CAD parts from Teamcenter. These tools offer many configuration options for custom export/import to the PLM system, and are typically driven by a proper configuration file during migration.
The most important consideration for CAD migration is to understand the organization's modeling philosophy, such as the master model concept and inter-part relations. For instance, in NX you can create CAD models which depend on other parts through expressions or geometry. This understanding is essential to define the process and sequence of CAD data migration. The following should be considered before deciding on a CAD migration approach:
1)      The modeling concepts used in the organization.
2)      Interdependent part relations in the CAD model, such as expressions, wave links, patterns, templates, etc. in NX.
3)      Categorization of parts into different buckets to decide the sequence of migration: library parts, assemblies, sub-assemblies, etc.
4)      Non-master files such as CNC or drafting files, and their dependencies on other parts.
Based on this analysis, define the CAD migration strategy.
CAD vs Meta Data Migration:
It is also important to understand the dependency between metadata and CAD in the PLM system. In most PLM systems, the CAD integration also depends on object relations in the system; for example, a wave link in NX also creates a Teamcenter relation. CAD migration also requires migrating CAD files under new part numbers and types. This information is part of the metadata but is required for correct migration of CAD into the new system, so it must be available from the staging DB.
This closes my discussion of migration. I have tried to cover the important aspects from the analysis to the design phase. There are other important aspects of migration, such as testing, which is a topic of its own; I will try to cover it in a separate blog.

See also :

PLM Migration : Part 1

Sunday, September 9, 2012

PLM Migration : Part 1

One of the most challenging aspects of the PLM field is data migration. It requires migrating not only metadata but also vaulted files, especially CAD files, which adds a higher degree of complexity. Not only must all objects and relations be migrated correctly, but the sanity of CAD inter-part relations must also be maintained. In this blog I will discuss the factors to consider for PLM migration. Since migration is a big topic, I will split the content into two parts.
Migration Consideration:
The following are the important considerations for PLM migration:
·         What to Migrate
·         How to Migrate
·         Before you start
·         Design Approach
·         CAD Migration
·         CAD vs Meta Data Migration.
What to Migrate:
For any migration you first have to know what needs to be migrated. A PLM migration has three aspects:
1)      Meta Data Migration
2)      Document File Migration
3)      CAD Migration
Metadata migration moves the data stored in the database from one system to the other. Document migration moves the actual files maintained by one system to the other. CAD migration moves the CAD models along with their inter-part relations, and typically relies on vendor-specific tools.
From a system perspective, the migration can be either:
1)      Single-system migration, i.e. from one system to another.
2)      Multi-system migration, which can be either:
a.       Multiple systems to a single system.
b.       Multiple systems to multiple systems.
From a complexity perspective, multiple-to-multiple migration is the most complex, but it is usually broken down into several projects so that each becomes a multiple-to-single-system migration.
To define what to migrate, we first have to gain a good understanding of the old and new systems, and of the business processes. In many cases, moving to a new system also involves changing business processes to leverage the advanced functionality of the new system. Hence a proper analysis of the old vs. new processes is required, as well as a gap analysis to find any shortcomings in the new system's functionality.
Once that analysis is done, the next question is what data needs to be migrated. In all cases, metadata mapping plays a prominent role, as it represents the real mapping of business processes to objects. The following aspects of metadata mapping have to be understood properly:
1)      Which business objects from the present system need to be migrated to the new system.
2)      Which properties of those business objects need to be migrated.
3)      How the relations between objects differ between the old and new systems.
Object mapping is the most important aspect of migration; the success of any migration depends on how effectively and correctly the mapping is done. Mapping will usually boil down to one of these scenarios:
1)      One-to-one mapping: one object maps to a similar object in the new system.
 Ex: Part to Part migration.
2)      One-to-many mapping: one object maps to many objects in the new system.
 Ex: an SAP Routing can map to an Item and an Item Revision.
3)      Many-to-one mapping: many objects map to one object in the new system.
 Ex: two objects joined by a relation object merge into a single object in the new system.
4)      Property-to-object mapping: a property in the old system maps to a business object in the new system.
Ex: the Work Area property of a Routing Operation in SAP maps to a Workcell Item in Teamcenter.
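A one-to-many mapping of the kind described above can be sketched as follows. The legacy record layout and target object names are illustrative assumptions, not a real SAP or Teamcenter schema.

```python
# Hypothetical one-to-many mapping: a legacy "Routing" record maps to an
# Item plus an ItemRevision in the new system (field names are made up).
def map_routing(legacy):
    item = {"type": "Item", "item_id": legacy["routing_no"]}
    rev = {"type": "ItemRevision",
           "item_id": legacy["routing_no"],
           "rev_id": legacy.get("version", "A")}   # default first revision
    return [item, rev]

mapped = map_routing({"routing_no": "RT-5001", "version": "B"})
print([m["type"] for m in mapped])  # -> ['Item', 'ItemRevision']
```

In a real migration these mapping functions would live in the transformation stage of the ETL pipeline, one per scenario.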
The next step concerns the approach to migration.
 How to Migrate:
Once the What of the migration is clear, then comes the How. The approach is defined by considering many factors, which can be categorized as follows:
1)      Bulk vs. phase-wise migration
2)      Custom migration vs. third-party migration tools
Bulk vs. phase-wise migration: Bulk migration moves everything in one go. The following are the characteristics of bulk migration:
    • Complete movement of all data to the new system at once.
    • All users switch over at the same time.
    • All processes move to the new system at once.
    • The source system should not be modified after migration (read-only).
Phase-wise migration: This involves migrating in phases over a specific duration, which may range from a few months to a few years. It requires the old and new systems to coexist for the whole migration duration, and may require a synchronization interface between them during the migration period. The following are the characteristics of phase-wise migration:
    • Movement of specific data to the new system based on certain criteria, e.g. group-wise, project-wise, workflow-based.
    • Users switch over in sets over time.
    • Co-existence of the old and new systems for some duration (from months to a year or so).
Bulk Migration vs. Phase Migration:
    • Cost: Bulk has lower overall costs; Phase has a higher overall cost, since two systems must be maintained.
    • Synchronization: Bulk needs no synchronization between the legacy and new systems; Phase requires synchronization between the old and new systems.
    • Change management: Bulk poses a high change management and training challenge; Phase allows better change and training management because migration happens in phases.
    • Risk: Bulk has a high impact in case of migration failure; with Phase, rollback is easy since the old system coexists with the new one.
    • Suitability: Bulk suits simple to medium-complexity data and simple CAD with few BOM lines and little depth; Phase suits complex data models and huge CAD assemblies.


Custom migration vs. third-party migration tools: The second decision point is the migration tooling. Migration tools can be either custom tools built specifically for the migration, or vendor tools. Most vendors offer off-the-shelf tools to migrate into their native system; for example, Siemens PLM offers GMS for migration from legacy systems to Tc UA, and other PLM vendors provide similar tools. Analysis is required to decide whether those tools suit the migration use cases. PLM vendors usually claim to cover all use cases, but in reality there are gaps, all the more so if the source PLM system has its own migration tool. Such gaps can lead to more effort and cost than a custom approach.
A custom migration approach is better suited if the migration is from multiple systems and the data model is complex. Custom migration gives more flexibility and control, but it requires more effort and upfront cost for development and for regression testing. Most of the time the custom approach is implemented through an ETL (Extract, Transform and Load) process, where transformation is done in a staging database. The following are the characteristics of both approaches.

Custom Migration Approach vs. Third-Party Tool Migration:
    • Cost: Custom has a higher upfront cost; with a third-party tool, only the license cost is involved.
    • Suitability: Custom suits complete multi-system migrations; a third-party tool suits simple migrations that the tool supports.
    • Testing: Custom requires regression testing; a third-party tool needs only limited testing, as it is certified by its provider.
    • Flexibility: Custom offers flexibility when business cases change; a third-party tool can be costly and time-consuming if it does not support those use cases.


A custom migration approach does not necessarily mean developing everything from scratch. Most of the time a custom solution also uses out-of-the-box utilities provided by the PLM system, usually with a wrapper written on top of them to make the migration seamless. For example, migrating CAD data always involves vendor-provided import/export tools, which must be kept in sync with the metadata; that synchronization can be done through custom tools.
Before you start:
Before starting the actual migration design, it is most important to close the following aspects:
1)      Data mapping is completed.
2)      All gap analysis is done and the approach is finalized accordingly.
3)      The expected duration of the actual migration and the delta approach are known.
4)      Infrastructure requirements, mainly server capacity, disk space, etc., are defined.
We have already discussed data mapping and the migration approach. It is also important to plan the expected duration, because it drives the strategy for handling deltas during the migration process. Delta means all changes made in the native system between the start and completion of the actual migration. Since users can start working in the new system only when the whole migration is complete, it is important to bring these transition changes into the new system as well. This is done through delta migration planning. Delta migration comes into the picture when the migration is expected to take more than a few days and the changes cannot feasibly be moved from the old to the new system manually.
Infrastructure requirements are also an important factor and need considerable planning. Typically they are calculated based on the amount of metadata as well as the volume data to be migrated.
This closes the first part of the PLM migration blog, in which we discussed the factors to consider for migration. In the next blog we will discuss the design and CAD aspects of migration.

See also :

PLM Migration : Part 2

Thursday, August 30, 2012

Basic of PLMXML Export/Import

PLMXML is a very powerful tool for importing/exporting data and files between Teamcenter and external systems. PLMXML is a Siemens PLM-sponsored XML schema for exporting/importing metadata as well as files from Teamcenter to other systems.
The advantage of PLMXML is its flexibility in defining rules for data extraction and import, which makes it one of the most widely used tools for integration and data exchange. It is also widely used in other Teamcenter modules such as Report Builder and integrations. Rules are defined in an admin module, where new rules can be created and existing ones modified. These rules are called transfer modes. In this blog we will discuss the transfer mode and its child rules in detail.
Transfer Mode:
A transfer mode encapsulates the rules which define how data is imported to or exported from Teamcenter: it governs the export/import rules and the metadata to be extracted. A transfer mode mainly consists of:
·         Closure Rule
·         Filter Rule
·         Property Set

Closure rule: This defines the scope of the data transfer. Basically, it tells the system how to traverse from a target object to its related objects, and whether to process or skip each object. For example, if you are exporting an Item Revision and also want to export attached datasets of a specific relation and type, you define that in the closure rule.
A closure rule comprises five fields: primary object selector, secondary object selector, relation selector, action, and an optional conditional clause.



The primary object selector is the target object from which traversal is defined to its related object, called the secondary object selector. In our Item Revision to Dataset example, Item Revision is the primary object and Dataset the secondary object selector, since the rule traverses from Item Revision to Dataset. The primary and secondary object selectors can be either a business type or a class. The third field, the relation selector, defines how the primary and secondary objects are related: objects in Teamcenter can be related through a relation, an attribute or a property. Another option is refby, which means the secondary object references the primary object. There are also two selectors specific to BOMs: occurrencetype and content.

The action field tells the system what to do with the secondary object. It takes the following values:
Skip: Do not export the secondary object and do not traverse further from it.

Process: Process (extract) the secondary object, but do not traverse further.

Traverse: Do not process the object (no data extraction), but traverse further to other related objects as defined in the closure rule.

Alter_closure_rule: Change the closure rule for the secondary object.

Only the Skip action cannot be combined with the others. Process+Traverse means process the secondary object and then traverse further to process its related objects.

Closure rules are sequential: a secondary object becomes the primary object when you want to extract data related to it. For example, if you want to extract the files of a dataset when an Item Revision is exported, you need a further closure rule defining the traversal from Dataset to ImanFile.
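The sequential process/traverse behaviour can be modelled with a toy closure-rule engine. The object graph, IDs and rule tuples below are hypothetical simplifications of the real PLMXML engine, meant only to show how actions chain.

```python
# Toy object graph: each object has a type and named relations to others.
# IDs, types and relation names are illustrative, not real Teamcenter data.
graph = {
    "rev1":  {"type": "ItemRevision",
              "relations": {"IMAN_specification": ["ds1"]}},
    "ds1":   {"type": "Dataset", "relations": {"ref": ["file1"]}},
    "file1": {"type": "ImanFile", "relations": {}},
}

# Closure rules as (primary type, secondary type, relation, action) tuples.
rules = [
    ("ItemRevision", "Dataset", "IMAN_specification", "PROCESS+TRAVERSE"),
    ("Dataset", "ImanFile", "ref", "PROCESS"),   # extract the file, stop there
]

def traverse(obj_id, processed):
    """Apply closure rules from obj_id, collecting processed object ids."""
    obj = graph[obj_id]
    for rel, targets in obj["relations"].items():
        for tid in targets:
            for prim, sec, r, action in rules:
                if (obj["type"], graph[tid]["type"], rel) == (prim, sec, r):
                    if "PROCESS" in action and tid not in processed:
                        processed.append(tid)        # extract secondary object
                    if "TRAVERSE" in action:
                        traverse(tid, processed)     # secondary becomes primary
    return processed

print(traverse("rev1", ["rev1"]))  # -> ['rev1', 'ds1', 'file1']
```

Note how the second rule is only reached because the first rule's TRAVERSE made Dataset the new primary object, which is exactly the sequential behaviour described above.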

Property Set: This defines which properties are extracted for processed objects. The closure rule decides which objects are processed, whereas the property set defines which data/properties are extracted for those objects. By default, Teamcenter extracts a standard set of properties for each object type, for example the revision id for an item revision. If the user needs to extract further information, it must be defined in a property set and added to the transfer mode.
It consists of the following fields:
Class or Type: Defines the class or type name.
Property or Attribute: Defines the attribute to be exported.
Action: Defines whether to process the property or skip it.

So, for example, if we need to extract the description property of Item Revision, it is defined as shown in the figure below.
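The effect of a property set can be sketched as a simple per-type property filter. The property names here are illustrative and not necessarily the real Teamcenter internal names.

```python
# Hypothetical property set: for each type, the properties to export
# beyond the defaults (names are made up for illustration).
property_set = {"ItemRevision": ["item_revision_id", "object_desc"]}

obj = {"type": "ItemRevision", "item_revision_id": "A",
       "object_desc": "Initial release", "owning_user": "smith"}

# Export only the properties the property set requests for this type;
# owning_user is silently dropped because it is not listed.
exported = {p: obj[p] for p in property_set.get(obj["type"], []) if p in obj}
print(exported)  # -> {'item_revision_id': 'A', 'object_desc': 'Initial release'}
```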



Filter rules: Filter rules allow a finer level of control over the data that gets translated along with the primary objects, by specifying a user-written function that is called to determine the filter applied to a given object. They are needed for complex scenarios where closure rules are insufficient to control PLMXML data export/import, mainly for conditional (if/else) cases, and they require customization.

Action rules: Action rules are sets of methods which can be called before, during and after the translation. Teamcenter provides some out-of-the-box action rules, but client-specific cases usually require customization. In Tc UA, action rules are deprecated and replaced by the BMIDE extension mechanism.