Sunday, November 20, 2011

Approach for a Successful PLM Implementation in an Enterprise

PLM implementation is a complex system integration effort that involves not only technology deployment but also process streamlining and organizational change. More often than not, a PLM system fails to fulfill its promise not because of technology limitations but because the organization fails at process and change management. In this post I will explore various reasons for failure rooted in poor organizational change management, and an approach to overcome them for a successful PLM implementation.


Organization Change Management
Adoption of PLM in an enterprise usually starts with great fanfare and high expectations. But down the line, people realize that technology implementation is only one aspect; the other is process and change management within the organization. For example, one giant company has spent hundreds of millions of dollars on its PLM system but still struggles to streamline part management and retire its old legacy systems. Part of the problem lies in the approach the company takes to PLM, and part lies with the vendor whose product it implements. Companies usually approach PLM purely from an IT perspective rather than as core business process management. As a result, the PLM system is selected purely on technology and functional richness, rather than on its adaptability and suitability to the company. Because PLM is seen as an IT matter, the internal team is also drawn from the IT department alone, whereas it should be a mix of business and IT people.


PLM Vendor Selection
PLM vendors are more interested in selling their product than in providing a business solution to the client. This leads them to bend the client's actual business process issues to suit their product offering. The result is low user-level adoption once the PLM system is implemented, and millions of extra dollars spent by the enterprise on heavy customization to make it usable, customization that was earlier avoided because the vendor promised a configuration-rich product. PLM vendors are also usually not good at organizational change management, which is a key ingredient of successful PLM adoption in any enterprise.


Enterprise PLM Vision
The other aspect that usually defines the failure or success of PLM is how future implementation and changes are managed. PLM usually starts in an organization with a good long-term vision. But as time goes on, the vision takes a back seat and the focus shifts from strategy to providing immediate solutions to user problems and requirements. This usually makes the PLM system more complicated and less agile in the long run. This change of track can be seen most of the time in big organizations, and the reasons can be many. One is the lack of a PLM strategy team to define and track the long-term goals of the company. A second, mainly in big enterprises, is that powerful independent business verticals want to drive changes in the PLM system based on their narrow vision and requirements rather than the broader vision and strategy of the enterprise. This leads to deviation from the vision defined in the early stages of the PLM implementation. Hence the best approach is to have a PLM strategy team drawn from the top management of the different business units, which monitors progress at specific intervals. Any deviation from the vision should be approved by this strategy team.


Summary
The following points summarize the discussion above.
1) Looking at PLM purely from an IT perspective is a mistake. Always treat PLM as a business process platform rather than an information management tool.
2) A company's internal PLM implementation team should include both IT and business users.
3) PLM system evaluation should be driven purely by the organization's requirements and long-term vision, not by the feature offerings of PLM vendors.
4) Vendor product offerings should be strictly validated against business requirements.
5) Organizational change management is the most important, and most often ignored, aspect of PLM. A strong change management team is recommended; prefer to engage external professionals with deep experience in organizational change management.
These are my views. Please feel free to provide your own views and comments.

Saturday, November 12, 2011

PLM in the Cloud: Limitations and Approach

For the last couple of years a lot of buzz has been created around cloud computing and its promised benefits. Those benefits remain to be seen, as enterprises are still struggling to find the best way to approach it. As in other IT domains, PLM in the cloud is a hot topic in the PLM world. There is a lot of noise about its obvious benefits and cost savings, but PLM differs fundamentally from other areas. In this post I will discuss these limitations and some architectural approaches for implementing PLM in the cloud.
PLM Cloud Limitations:
When we talk about PLM, it does not just include information stored as metadata somewhere in a database. It is much more than discrete metadata. It includes product information, which can be a big CAD geometry file of hundreds of MB, or an Excel file holding patented engineering information. All these complex data files are also associated through relations maintained by different PLM tools, so making such diverse files available in the cloud is itself a big challenge. Secondly, PLM is usually fully implemented by mid-size to big corporations whose main objective is to streamline product development and safeguard know-how, rather than to save cost on this critical piece of IT. Looking at these aspects, a full offering of PLM in the cloud is probably not a good idea. To summarize, these are the limitations of PLM in the cloud:





  • PLM is not just about metadata. The major chunk is the vault, or file management. Managing these files, which can be several hundred MB each, is difficult.


  • Companies will be reluctant to have their core product knowledge in the cloud.


  • PLM is a cluster of different tools that together make up the overall PLM system in an organization, and this cluster varies from enterprise to enterprise. Offering a homogeneous PLM tool in the cloud cannot solve the problem.


  • Full PLM implementations are usually done by mid-size to large enterprises with deep pockets. Cost saving cannot be the only reason to move to the cloud.


Having said that, there is still scope for implementing PLM in the cloud with some clever architecture and careful scoping. For example, vendor collaboration can be done through the cloud so that vendors do not require access to your core PLM system or network. Below I propose some PLM cloud architectures which can overcome some of the limitations.




PLM in the Cloud: Architecture Proposals
Option 1: Vault in the local environment, PLM in the cloud
This solves the problem of transferring data between the cloud and the local environment, for example CAD data that can run to megabytes. The company will also feel more secure, as its core product information stays within the enterprise network.



Option 2: Partial PLM in the cloud for collaboration
This can be effective for vendor collaboration as well as for multi-site environments, and helps the enterprise define access in a better way.


Conclusion: Although cloud computing is a promising technology, from a PLM perspective it has the limitations discussed in this post. Still, if we apply the right approach, rather than putting the full PLM system in the cloud, an enterprise can leverage the advantages of the cloud. I have discussed two conceptual approaches; I am sure there are many others. I welcome all comments and discussion on this topic.

Saturday, November 5, 2011

PLM Implementation: Design for Support

One aspect of software design that is rarely given due attention is design for support. It means designing the system in such a way that, in case of any breakdown or failure, the problem can be easily traced and resolved quickly. Because PLM is deployed at the enterprise level, support considerations have to be taken into account during the implementation stage. But I have rarely seen them given the importance they deserve. There are many reasons for this, but one is the timeline and complexity that a PLM implementation has to deal with: the support aspect only comes up once the system goes live. Here I will discuss some critical considerations to keep in mind while designing a PLM solution.
What do we mean by design for support? In short, it means designing the system so that all unexpected cases are handled appropriately and propagated to the relevant systems so that proper action can be taken. One question arises: how is this different from error handling? There is quite a difference between generic error handling and exception handling due to environment instability. A typical example of generic error handling is a "string size too large" kind of check. Exceptions relevant to support usually require corrective action from an admin or support person.
Support Considerations during Design:
While designing a PLM system it is important to list all exceptions that will require corrective action from support. Ideally a design review should be done with the support team before sign-off. But this is rarely done in new PLM implementation projects, because implementation is done by one vendor and support by another, so coordinating them and keeping them in sync is always a challenge. Based on my experience, the following points should be considered from a support perspective while designing a PLM system.

Design for Support Consideration and Approach
1) Define or discover the organization's present support strategy.
2) Find out which tools the organization presently uses for infrastructure management. For example, some enterprises use a standard tool like System Center Operations Manager (SCOM), which also monitors the health of applications on remote systems. Understand the infrastructure management tool so that the PLM system can be integrated with it.
3) List all exceptions for all business cases where a user may require help from support.
4) Categorize each exception by priority. A business analyst can be consulted here.
5) Define an appropriate message for each exception.
6) Define and document an appropriate corrective action for each exception.
7) Design the PLM implementation to accommodate these exception cases and emit appropriate exception output from the PLM system so that it can be handled by the infrastructure management tool.
8) Get approval from all stakeholders, mainly the support team that is going to own the support activity.
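As a minimal, hypothetical sketch of points 4 to 6, an exception type can carry the priority, the message, and the documented corrective action together, so that a log-based infrastructure management tool can act on it directly. The class and field names here are illustrative, not part of any PLM product API.

```java
/**
 * Illustrative support-oriented exception: each support-relevant failure
 * carries a priority, a user-facing message, and a corrective action.
 * All names here are hypothetical, not from any real PLM product.
 */
public class SupportException extends RuntimeException {

    public enum Priority { LOW, MEDIUM, HIGH, CRITICAL }

    private final String errorCode;
    private final Priority priority;
    private final String correctiveAction;

    public SupportException(String errorCode, Priority priority,
                            String message, String correctiveAction) {
        super(message);
        this.errorCode = errorCode;
        this.priority = priority;
        this.correctiveAction = correctiveAction;
    }

    /** One-line, machine-parsable form for log-based monitoring tools. */
    public String toMonitoringRecord() {
        return String.format("%s|%s|%s|%s",
                errorCode, priority, getMessage(), correctiveAction);
    }

    public static void main(String[] args) {
        SupportException e = new SupportException(
                "PLM-VAULT-001", Priority.HIGH,
                "File vault unreachable",
                "Check vault service and network share, then restart FMS");
        System.out.println(e.toMonitoringRecord());
    }
}
```

The point of the fixed record format is that a tool like SCOM can raise an alert from the priority field and show the support person the corrective action without consulting separate documentation.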
Design for support is an important aspect of PLM implementation but is rarely given much consideration. Due attention during the design stage will help reduce support resolution time and can save an organization a considerable amount of cost in the long run.

Sunday, October 23, 2011

Configuration Based PLM Implementation

Configuration-based software development is currently highly desirable and demanded by clients. PLM products come with a high level of configuration, with the aim of reducing the solution cost to the client. But in most cases in the PLM domain the client has some unique feature, one way or another, that cannot be satisfied by configuration, and some customization has to be done. Clients prefer custom solutions built so that future changes can be incorporated through configuration rather than new implementation, and so that the solution can be reused in similar situations. This reduces their future cost. In this post I will discuss different ways of doing configuration in Teamcenter and how to balance configuration-driven development, since an overloaded configuration-based implementation can kill the real benefits and increase the cost of development and maintenance.

Configuration-Driven Implementation:
Configuration-driven implementation can be defined as development that can incorporate future requirement changes through a change in some property or attribute defined in the system. Basically, changes can be made without changing the code. Configuration-driven implementation lays its foundation in the requirements themselves, as the constraints of a requirement define how far a configuration approach can be taken. If the requirement is too specific and unique, a configuration approach may not be a good idea; but if it is generic enough, a configuration-driven approach can be taken. For example, the present requirement may be to process some logic for a specific part type only, but in the future other part types may follow the same business process. While designing the configuration approach, it should also be taken into account whether there is any real possibility of the requirement changing. For example, an organization rarely changes its change management process once it is developed, so making that solution heavily configuration-based may not be a sound approach. The other factor is reusability: for example, a workflow handler can be written so that it is driven by arguments and can be reused for similar cases with different business object types and statuses. To summarize, these are the three factors that decide whether to go for a configuration approach:

1) Whether the requirement is generic or too specific.
2) Expected changes in the future.
3) Reusability of the implementation.

Configuration-Based Approaches in Teamcenter:
In Teamcenter you can drive a configuration-based approach through various means. The main tool, which is also used in the product's own configuration, is preferences. Preferences are internal Teamcenter environment variables stored in the database. Teamcenter provides various APIs, on both the server and the client side, to access these preferences. One advantage of the preference-based approach is that Teamcenter provides different levels of preference control, at site, group, role, and user level. Access control for editing preferences is also defined, so different configurations can be used wisely. The second approach is through a configuration file: we define a set of properties in a specific format which is read by the code at runtime. The config file is usually kept in a specific location, typically the tcdata directory. This approach has its limitations, as specific code has to be developed to read and interpret the config file, and it is not stored in the Teamcenter environment. The third approach is used for workflow handler development: handlers can be configured by providing arguments and their values while designing the workflow. This approach is widely used to make handler behavior generic enough to be driven by arguments. For example, a handler can check the state of a target object type, where the object type is defined in an argument.
To summarize, configuration-driven implementation approaches in Teamcenter can usually be categorized into three types:

1) Preference-based approach.
2) Configuration-file-based approach.
3) Argument-based approach.

Usually an implementation is driven by a mix of the above, but the most preferred way should be the preference-based approach.
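The argument-based approach can be sketched in plain Java. The argument names and the check itself are hypothetical, not real Teamcenter handler APIs, but they show how one handler body stays reusable across object types and statuses:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/**
 * Minimal sketch of an argument-driven check: behaviour is controlled
 * entirely by arguments supplied at configuration time, so the same code
 * can be reused for different object types and statuses.
 */
public class GenericStatusCheck {

    /** Passes when the target type is not configured, or its status matches. */
    public static boolean passes(Map<String, String> handlerArgs,
                                 String targetType, String targetStatus) {
        // Comma-separated list of types this handler instance applies to.
        Set<String> types = new HashSet<>(
                Arrays.asList(handlerArgs.getOrDefault("object_type", "").split(",")));
        String requiredStatus = handlerArgs.getOrDefault("required_status", "");
        // Types not listed in the argument are simply ignored.
        if (!types.contains(targetType)) {
            return true;
        }
        return requiredStatus.equals(targetStatus);
    }

    public static void main(String[] args) {
        Map<String, String> handlerArgs =
                Map.of("object_type", "Part,Design", "required_status", "Released");
        System.out.println(passes(handlerArgs, "Part", "Released"));    // true
        System.out.println(passes(handlerArgs, "Part", "In Work"));     // false
        System.out.println(passes(handlerArgs, "Document", "In Work")); // true: not configured
    }
}
```

To extend the same check to a new object type, only the `object_type` argument changes; the code is untouched, which is exactly the benefit the configuration approach is after.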

Saturday, October 15, 2011

Customizing Translation Services

In my last two posts I gave details about the Teamcenter translation framework and its configuration. In this post I will cover customization: creating a new translator service. As discussed in the Dispatcher Framework post, the three main components are the Dispatcher Client, the Module, and the Scheduler. For any customization these components have to be customized; in almost all cases it is the Dispatcher Client and the Module. In the Dispatcher Client you write code for extracting or loading (or both), and for the Module you define the exe or bat file that actually does the translation. You also have to configure translator.xml, as discussed in my earlier configuration post.
Customization Steps: Let us take a translation use case where we need to translate the content of a dataset from one language to another and upload the translated file to the same attached revision, with a specific relation and dataset type. The requirement is to make this relation type and dataset type configurable. The steps to be performed are as follows.
1) Extract the file to be translated from Teamcenter.
2) Translate the file to the required language, using either a custom language translator or a third-party translator such as Google Translator.
3) Upload the file back to Teamcenter by creating a dataset and uploading the file, then attach the dataset to the item revision with the given relation.
Steps 1 and 3 will be part of the Dispatcher Client, whereas step 2 will be part of the Module implementation.
Dispatcher Customization: The Dispatcher provides a Java-based out-of-the-box TaskPrep implementation for extracting specific data from Teamcenter. It also provides an OOTB data loader for loading the output files automatically. This automatic behavior can be controlled without writing any code by configuring translator.xml for the specific translation service. The class documentation for the Dispatcher Client can be found under the docs folder inside the Dispatcher Client root directory, and sample implementations under the sample folder.
Implementation:
The Dispatcher framework provides two main classes to extend when customizing a translator:
1) The TaskPrep class, for extracting the file from Teamcenter; implement it to prepare the task submitted to the translation service.
2) The DatabaseOperation class, for loading the translated files into Teamcenter.
Usually both classes have to be implemented for a new translation service. TaskPrep is called first when a translation request is created: the Dispatcher finds the specific TaskPrep implementation for the corresponding translation service request. Once the translation is done by the Module, the Dispatcher invokes the DatabaseOperation implementation for the given service to upload the data to Teamcenter. In our example, to convert a text document from one language to another, the TaskPrep first extracts the document from the target dataset and puts the file in the staging location. Once the Module completes the translation, the DatabaseOperation class is invoked and the translated file is uploaded to Teamcenter with the specified dataset type and attached to the target object with the given relation.





Extraction Implementation: The TaskPrep implementation is done by extending com.teamcenter.ets.extract.TaskPrep, an abstract base class for extraction. The abstract class has member variables that encapsulate the details of the translation request as well as the staging directory location. Some of the key members are:
request: the request object for the current extract session.
stagingLoc: the staging location in which all the files will be placed.
The function called on execution is prepareTask(). It is declared abstract in TaskPrep and must be implemented by all extending classes.
Pseudo-implementation: In the implementation we usually access the target object, called the primary object, and its associated objects, called secondary objects, through the current request object:
m_PrimaryObjs = request.getProperty("primaryObjects").getModelObjectArrayValue();
m_SecondaryObjs = request.getProperty("secondaryObjects").getModelObjectArrayValue();
ModelObject is the SOA base wrapper class for all Teamcenter objects. The primary objects are the objects selected for translation in Teamcenter, and the secondary objects are those associated with the target through a relation. For example, in the language translation case the text file dataset is the target (primary) object, and the item revision it is attached to is the secondary object.
Once we get the primary object, the named reference file has to be extracted from the dataset and put in the staging location. This is done through an SOA call to Teamcenter. A sample code snippet:
Dataset dataset = (Dataset) m_PrimaryObjs[i];

ModelObject contexts[] = dataset.get_ref_list();
ImanFile zIFile = null;
// Walk the named references and stage every ImanFile found.
for (int j = 0; j < contexts.length; j++)
{
    if (!(contexts[j] instanceof ImanFile))
    {
        continue;
    }
    zIFile = (ImanFile) contexts[j];
    m_InputFile = TranslationRequest.getFileToStaging(zIFile, stagingLoc);
}
We also need to create the translation request details, which the Module refers to during translation. For example, in the language translator use case we need an option where the user can specify the language to translate from and the language to translate to. This is done with translator arguments supplied when the user creates the translation request; they can be retrieved and processed further in TaskPrep:

Map translationArgsMap = TranslationRequest.getTranslationArgs(request);

This map contains each argument name as a key with its value. TaskPrep can also add its own arguments during processing, to be used by the Module or the database loader:

translationArgsMap.put(argumentKey, argumentValue);

For example, in the language translator we might need to change the character set to a specific value based on the target language selected by the user.
The translation request details are written to an XML file that resides in the staging directory under a unique task ID. Basically this XML contains the user arguments required for the translation; in our example, the from-language and to-language options used by the Module while translating. It also holds details of the corresponding dataset, item, and item revision. These can be populated but are not always required; in our example we also populate the UIDs of the primary and secondary objects, which are used when loading the translated file with a specific relation to an object.
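A hypothetical sketch of such a request XML, based only on the description above; the element and attribute names are assumptions, not the actual Dispatcher schema:

```xml
<!-- Illustrative only: the real request-file schema may differ -->
<TranslationRequest provider="exampletranslation"
                    service="examplelanguagetranslation">
  <TranslationArgs>
    <Argument key="from_lang" value="en"/>
    <Argument key="to_lang" value="de"/>
  </TranslationArgs>
  <TargetObjects>
    <PrimaryObject type="Text" uid="PRIMARY_OBJECT_UID"/>
    <SecondaryObject type="ItemRevision" uid="SECONDARY_OBJECT_UID"/>
  </TargetObjects>
</TranslationRequest>
```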

Loader Implementation: The database loader is implemented by extending the com.teamcenter.ets.load.DatabaseOperation abstract class; the load() function has to be implemented for a translation service. DatabaseOperation has a TranslationData attribute, transData, which encapsulates all the data of the translation request. From the translation data we can get all the information populated during extraction (TaskPrep). For example, from transData you can get the list of result files from the translation, which lets the loader load all the files into Teamcenter. The pseudo-code for this is:

TranslationDBMapInfo zDbMapInfo = transData.getTranslationDBMapInfo();
List zResultFileList = TranslationTaskUtil.getMapperResults(zDbMapInfo, scResultFileType);

Here TranslationTaskUtil is a utility class providing various generic facilities, and scResultFileType is the expected file type of the translation result. User and TaskPrep options can be accessed through the TranslationTask, which is a member of DatabaseOperation. The pseudo-code for this is:

TranslatorOptions transOpts = transTask.getTranslatorOptions();
TranslatorOptions encapsulates all options as key-value pairs, which can be accessed as follows:

Option transOption = transOpts.getOption(i);
if (transOption.getName().equals("SomeOptionName"))
    strOutputType = transOption.getValue();

Uploading of the result files is done through SOA calls to Teamcenter. The Dispatcher framework provides various helper classes which encapsulate these SOA calls; if a requirement cannot be fulfilled through a helper class, SOA can be called directly. One of the helper classes is DatasetHelper, which provides all functions related to datasets. One of its functions creates a new dataset for the result file list and attaches it to the primary or secondary target with a given relation. The pseudo-code for this is:

dtsethelper.createInsertDataset((ItemRevision) secondaryObj, (Dataset) primaryObj, "datasettype", "relationtype", "namereferencetype", "resultdir", "filelist", flagToInsertIntoSourceDatasetOrItemRevision);
Jar File Packaging and Dispatcher Configuration:
We need to create a JAR file for the custom dispatcher code. The JAR should also contain a property file which names the implemented task preparation and database loader classes; these are instantiated through the reflection mechanism in the Dispatcher framework. This is a sample property file:

Translator.<serviceprovider>.<translationservicename>.Prepare=<packagename>.<TaskPrep class name>

Translator.<serviceprovider>.<translationservicename>.Load=<packagename>.<DatabaseOperation class name>

The service provider is the name of the provider of the service; for the OOTB translation services it is SIEMENS. It is configured in the preference ETS.PROVIDERS. The translation service name is the name of the translation service as configured in the Module config and in the Teamcenter preference ETS.TRANSLATORS.

In the language translation use case, the service provider name could be exampletranslation and the service name examplelanguagetranslation. The content of the property file for this will be:

Translator.exampletranslation.examplelanguagetranslation.Prepare=packagename.LanguageTransTaskPrep
Translator.exampletranslation.examplelanguagetranslation.Load=packagename.LanguageTransDatabaseOperation

Once the JAR package is created, it has to be put in the DispatcherClient\lib folder. Also, service.properties under the DispatcherClient\config folder has to be updated with the property file name:

import TSBasicService,TSCatiaService,TSProEService,TSUGNXService,TSProjectTransService,ourpropertyfilename

This loads all the classes when the DispatcherClient is started.
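The property-file-plus-reflection mechanism can be illustrated with a self-contained sketch. The TaskPrep interface and class names below are stand-ins, not the real com.teamcenter.ets classes:

```java
import java.util.Properties;

/**
 * Self-contained sketch of the reflection mechanism described above:
 * entries of the form Translator.<provider>.<service>.Prepare=<class>
 * name the class to instantiate for a given translation service.
 */
public class ReflectionLoadingDemo {

    /** Stand-in for the framework's task-preparation contract. */
    public interface TaskPrep {
        String prepareTask();
    }

    /** A hypothetical custom task-prep implementation. */
    public static class LanguageTransTaskPrep implements TaskPrep {
        public String prepareTask() {
            return "extracted";
        }
    }

    /** Looks up the configured class name and instantiates it by reflection. */
    public static TaskPrep loadPrepClass(Properties props, String provider, String service) {
        String key = "Translator." + provider + "." + service + ".Prepare";
        try {
            return (TaskPrep) Class.forName(props.getProperty(key))
                    .getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Cannot load translator class for " + key, e);
        }
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty(
                "Translator.exampletranslation.examplelanguagetranslation.Prepare",
                "ReflectionLoadingDemo$LanguageTransTaskPrep");
        TaskPrep prep = loadPrepClass(props, "exampletranslation", "examplelanguagetranslation");
        System.out.println(prep.prepareTask()); // prints "extracted"
    }
}
```

This is why the JAR only needs to sit in DispatcherClient\lib: the framework never references the custom class at compile time, it discovers it by name at startup.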

Module Customization: Module customization usually has the following steps.
1) Create the translation program, or identify the third-party application to be used for the translation.
2) Usually we create a bat file wrapper to run the program. This is not strictly required, but most of the time we need to set some environment variables such as the Teamcenter root or TC_DATA.
3) Add the service details to translator.xml under Module\conf.

Let us take our language translator use case. We are going to use a third-party translator, for example Google Translator. There are Java APIs which invoke the remote Google translator; we will create a Java wrapper on top of such an API to build our Teamcenter translator. Our program will take the input file, the output file with its location, the language to translate from, and the language to translate to.

We probably also need a bat file to set JAVA_HOME and other environment variables; this bat file will take the same four parameters required by our program. A corresponding entry for the service has to be added to translator.xml.

In that entry, ModuleRoot is a keyword that refers to the Module base directory.
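A sketch of what the translator.xml entry for this service might look like, assembled from the configuration elements described in my configuration post (isactive, TransExecutable, Options, FileExtensions); the exact element and attribute names may differ in your Dispatcher version:

```xml
<!-- Sketch only: structure inferred from the COTS entries; verify
     against a shipped translator.xml before use -->
<ExampleLanguageTranslation provider="exampletranslation"
                            service="examplelanguagetranslation"
                            isactive="true">
  <TransExecutable dir="$ModuleRoot/Translators/examplelanguagetranslation"
                   name="examplelanguagetranslation.bat"/>
  <Options>
    <Option name="-input" description="absolute input file location"/>
    <Option name="-outputdir" description="output directory"/>
    <Option name="from_lang" description="language to translate from"/>
    <Option name="to_lang" description="language to translate to"/>
  </Options>
  <FileExtensions input="txt" output="txt"/>
</ExampleLanguageTranslation>
```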


When the Module framework reads this configuration, it converts it into the call

$ModuleRoot/Translators/examplelanguagetranslation/examplelanguagetranslation.bat -input="absolute file location" -outputdir="outputdir" "from_lang value" "to_lang value"
A workflow action handler can also be created to integrate the language translation with change management and other business processes. The ITK API to be used is DISPATCHER_create_request.
The handler can be implemented so that it takes arguments for the translation provider, the service name, and the various options supported by the translation service. This provides the flexibility to reuse the handler for other translation services.

That is all from my side on the Teamcenter translation framework. I hope it helps people who want a quick start on translation services. Any comments are welcome.

See Also :
Dispatcher Framework
Configuring Translator

Saturday, September 24, 2011

Configuring Translator

In my last post I discussed the Dispatcher framework of Teamcenter. In this post I will discuss each component in detail and how to configure a COTS translation service. As discussed in the last post, there are three components to the translator:
1) Scheduler
2) Module
3) Dispatcher Client
Each component runs as an independent service, possibly on a different server, and the components communicate through RMI.
Scheduler: This component acts as the moderator between the Module and the DispatcherClient; both have to be configured to communicate with the Scheduler. Once installed, the Scheduler directory structure is created with bin, config, and lib subdirectories.
The transscheduler.properties file, stored in the config subdirectory, defines the port and other properties of the Scheduler. The Scheduler can be run via runscheduler.bat in the bin subfolder, and lib stores all the JAR files belonging to the Scheduler. Most of the time no change is required in the Scheduler once it is installed.
Module: The Module is the component which does the actual translation. It interacts with the Scheduler to get translation tasks, and invokes a specific translation based on the task information and the translator services available to it. Once installed, the Module directory structure is created with bin, conf, lib, and Translators subdirectories.
The conf folder has translator.xml, which contains the details of all the translation services of the Module. The Module publishes these services to the Scheduler, and based on this information the Scheduler dispatches specific translation tasks to the Module.
The following describes a subsection of translator.xml configuring one translation service.
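A sketch of such an entry, reconstructed from the description that follows (exact element and attribute names may vary by Dispatcher version):

```xml
<!-- Sketch only: verify against the translator.xml shipped with
     your installation -->
<JtToCatiaV5 provider="SIEMENS" service="jttocatiaV5" isactive="false">
  <TransExecutable dir="d:/Siemens/DISPAT~1/Module/Translators/jttocatia/v5"
                   name="jttocatiav5.bat"/>
  <Options>
    <Option name="-i" description="input file directory"/>
    <Option name="-o" description="output file directory"/>
  </Options>
  <FileExtensions input="jt" output="CATPart,CATProduct"/>
</JtToCatiaV5>
```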
Every translation service in the Module starts with the name of the service as an element in translator.xml. In this example the translation service JtToCatiaV5 is configured under the provider SIEMENS with the service name jttocatiaV5, and the service is not yet active, meaning the Module will not publish it. If you want to make the service active, change the isactive value to true.
The TransExecutable element defines the location of the translator root directory and the name of the exe or bat file to call for the translation. In this example, jttocatiav5.bat, located at d:/Siemens/DISPAT~1/Module/Translators/jttocatia/v5, will be invoked.
The Options element defines which arguments are to be passed. In this example two arguments, -i and -o, with their values, will be passed by the Module to jttocatiav5.bat. The Module converts the configuration into the following call to the actual jttocatia translator:
d:/Siemens/DISPAT~1/Module/Translators/jttocatia/v5/jttocatiav5.bat -i {input file dir} -o {output file dir}
The input and output file directories are obtained at runtime from the DispatcherClient component via the Scheduler.
The FileExtensions element defines the input and output extensions expected by this translator. In this example the input file is of type .jt and the output of type .CATPart or .CATProduct.
Apart from this, the Module has a transmodule.properties file, in which details such as the Scheduler RMI port and the staging root directory are configured. The Module can be run via runmodule.bat in the bin subfolder, and lib stores all the JAR files belonging to the Module.
DispatcherClient: This is the component which interfaces with Teamcenter. Its main job is to fetch data files from Teamcenter and upload translated files back to Teamcenter. Once installed, the DispatcherClient directory structure is created with config and lib subdirectories.
The lib folder contains the JAR files for the DispatcherClient; all customized JAR files for new translator services have to be copied into this folder. (Details of translator customization are discussed in my next post.) The config folder contains all files related to the translator configuration for the DispatcherClient: DispatcherClient.config holds information about the translation server, such as the RMI port of the Scheduler and the staging directory, whereas Service.properties holds the information required to connect to Teamcenter, such as the host, the Teamcenter user for the translator, and the tcserver port number. For activating a COTS translation service we usually do not need to change anything in the DispatcherClient component.




Activating COTS Translation Service: Siemens provided many Cots service with installation. But this service are by default are not activated. Also some of the Translator service required core service to be installed. For example previewservice translation can only work if visualization and vis translator is installed. Following steps required to be done to make any COTS translator active
1) Modify Translator.xml inside the Module/conf folder. Make the service active by changing the isactive attribute to true (see the Translator.xml image above).
2) Change the core bat file for that translator, present in the root service directory, and provide all required variable values such as TC_ROOT, TC_DATA, UG_ROOT, etc. For example, the jttocatiaV5 translator requires UGS_LICENSE_SERVER to be provided in the catiav5tojt.bat file.
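Step 2 usually amounts to editing a few set statements near the top of the translator's .bat file, for example as sketched below. The paths and the license server value are placeholders for your environment, not real defaults.

```bat
rem Environment values required by the translator (placeholder paths/values)
set TC_ROOT=D:\Siemens\Teamcenter
set TC_DATA=D:\Siemens\tcdata
set UGS_LICENSE_SERVER=28000@licenseserver
```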
3) Verify that the following preferences are present in Teamcenter to enable the translator there:

a. ETS.PROVIDERS: Provides the list of translator providers. SIEMENS is the default value, so when configuring a COTS translation nothing needs to be changed here.
b. ETS.TRANSLATORS.&lt;provider&gt;: Provides the list of translations available from a provider. For a COTS configuration, the required translation service name must be added to the preference ETS.TRANSLATORS.SIEMENS; for example, jttocatiav5 is required to be added here.
c. ETS.DATASETTYPES.&lt;provider&gt;.&lt;translator&gt;: Provides the list of business object types on which this translator can be invoked as the primary object. For example, jttocatiav5 should be invoked for JT dataset types only; the preference to be set is ETS.DATASETTYPES.SIEMENS.JTTOCATIAV5, multi-valued, with values such as JtSimplification and other dataset types which support JT files.
There are other preferences which are optional.
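Summarizing the three preferences for the jttocatiav5 example given above (any values beyond those named in the text are environment-specific):

```
ETS.PROVIDERS                        = SIEMENS
ETS.TRANSLATORS.SIEMENS              = jttocatiav5
ETS.DATASETTYPES.SIEMENS.JTTOCATIAV5 = JtSimplification
                                       (multi-valued; add other dataset types that hold JT files)
```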


See Also :
Dispatcher Framework
Customizing Translation Services

Sunday, September 11, 2011

Teamcenter Dispatcher Framework.

Presently I am working on the Translation Service of Teamcenter, and I thought to share my learning experience with you. The Translation Service comes as a Dispatcher Service under the Teamcenter installation. A translation service simply translates one file format to another, for example DOC to PDF. The broader tasks of any translation are as follows:
a) Extract data from Teamcenter.
b) Execute the translation.
c) Load the translated result back to Teamcenter.
Hence the Dispatcher Service of Teamcenter has three main components:
1) Scheduler
2) Module
3) Dispatcher Client
There is one more component, called Dispatcher Admin, which is basically used for administrative activities and is optional. Each of the above three components runs independently, can be run as a service or in a console, and can run on a different server. As the name suggests, the Scheduler manages the whole framework by coordinating between the Module and the Dispatcher Client. The Dispatcher Client component manages the extraction and loading of data, while the Module does the actual translation. The diagram below depicts the translation framework.




The Dispatcher Client is the front end of the Dispatcher Framework; it interacts with Teamcenter through SOA for translation requests. Teamcenter needs to be configured through ETS preferences for new translation services and for the object types on which each service is valid. Once a request is received, the Dispatcher Client processes it and puts all extracted files that need to be translated into a directory called the staging directory, which is configured during Dispatcher Service installation. In the staging directory, the Dispatcher Client creates a unique subfolder for each request, based on the task ID generated when the user raises the request in Teamcenter. Once the Dispatcher Client completes the extraction, it informs the Scheduler that the task is ready for translation processing. The Scheduler in turn tells the Module to start processing the task. The Module translates the file and puts the output in the staging directory. Once the translation completes, the Scheduler pings the Dispatcher Client, which loads the translated file back into Teamcenter.
Siemens PLM provides many out-of-the-box translation services which need to be made active. In the next blog I will provide more detail about each component and its configuration.

See Also :
Configuring Translator
Customizing Translation Services

Saturday, September 10, 2011

Introduction

Thought to start a blog on my professional skills, so that I can share my expertise in PLM, and in Teamcenter specifically. I will try to start with specific subjects and hope to see comments, and accordingly will post regularly. See all of you soon with my first blog in this section.