Many global organisations are starting to build Central Finance (CFIN) systems to map, harmonise and integrate their transactional and master data into one place from various legacy ECC and other systems. A Central Finance implementation is not straightforward: it requires financial data to be loaded and replicated accurately, 100% of the time.
Here are some tips that may help with a successful implementation:
- When defining your universal data model and mapping for your CFIN target system, perform a comprehensive mapping and configuration analysis in all source systems to determine how key mapping objects are used and where replication conflicts may arise. This includes, for example, company code structures, ISO currencies, GL account settings (eg, open item managed), bank account validations (SWIFT compliance), payment term usage, profit centre usage, material types, CO-PA requirements (account-based or costing-based), classic versus new GL configuration, use of business areas, month-end cycles, tax and other local compliance reporting requirements, depreciation areas, etc. Build simulations in your staging environment to test the future-state mapping and reporting, so that any conflicts or errors are eliminated before the initial load and replication are activated.
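The conflict checks described above can be sketched in a few lines. This is a minimal illustration, not SAP code: the mapping keys, the ISO currency subset and the `company_code:account` key format are all assumptions made for the example.

```python
# Sketch: pre-load mapping conflict checks (illustrative data, not real SAP tables).
from collections import defaultdict

ISO_CURRENCIES = {"EUR", "USD", "GBP", "ZAR"}  # small subset, for illustration only

def find_mapping_conflicts(source_to_target):
    """Flag target values that more than one source value maps to.
    Many-to-one mapping is often intended, but should be reviewed explicitly."""
    by_target = defaultdict(list)
    for src, tgt in source_to_target.items():
        by_target[tgt].append(src)
    return {tgt: srcs for tgt, srcs in by_target.items() if len(srcs) > 1}

def find_non_iso_currencies(currency_keys):
    """Return source currency keys that are not recognised ISO codes."""
    return sorted(set(currency_keys) - ISO_CURRENCIES)

# Hypothetical GL account mapping, keyed as "company_code:account".
gl_mapping = {"0001:400100": "61000000",
              "0002:400150": "61000000",
              "0001:400200": "61200000"}
print(find_mapping_conflicts(gl_mapping))   # {'61000000': ['0001:400100', '0002:400150']}
print(find_non_iso_currencies(["EUR", "USD", "USD2", "GBP"]))  # ['USD2']
```

Running checks like these in the staging environment, before the initial load, surfaces many-to-one mappings and non-compliant reference values while they are still cheap to fix.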
- Reduce system change and transport overhead in your CFIN landscape and ensure all standard and custom mapping tables and objects are transportable. This keeps your dev, test, pre-prod and prod landscapes in sync for CFIN mapping and makes your system change and transport request processes more efficient.
- Monitor, manage and clear source open items. Clear all your old items, and consider removing open item management from GL accounts that are not bank clearing, goods receipt/invoice receipt (GR/IR), sub-ledger, reconciliation or control accounts. At the same time, use historic extracts of open and cleared transactional data to check the completeness and accuracy of your standard and custom mapping. Create a mapping register and identify the objects referenced in your transactional data (eg, payment methods, tax codes, currency keys). Many replication issues occur when transactions are processed using reference data that is used infrequently and was missed in your mapping set-up.
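A completeness check of the mapping register against historic transactional data might look like the sketch below. The field names (`payment_method`, `tax_code`) and the register structure are hypothetical; the point is simply to compare values actually used in transactions against values covered by the mapping.

```python
# Sketch: find reference values used in historic transactions but missing
# from the mapping register (field names are illustrative, not SAP fields).
def find_unmapped_values(transactions, mapping_register):
    """For each field in the register, return values seen in the
    transactional data that the mapping does not cover."""
    gaps = {}
    for field, mapped_values in mapping_register.items():
        used = {tx[field] for tx in transactions if field in tx}
        missing = used - set(mapped_values)
        if missing:
            gaps[field] = sorted(missing)
    return gaps

register = {"payment_method": {"T", "C"}, "tax_code": {"V0", "V1"}}
txs = [
    {"payment_method": "T", "tax_code": "V1"},
    {"payment_method": "D", "tax_code": "V0"},  # rarely used method, not mapped
]
print(find_unmapped_values(txs, register))  # {'payment_method': ['D']}
```

Run against a full historic extract, a check like this catches exactly the infrequently used reference data that tends to break replication after go-live.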
- From a program perspective, do integration testing from a production source system as early as possible. Most SAP programs prefer traditional integration testing from a quality system; however, because you are integrating from a source production system, where limited config changes are made, into a quality central finance environment, the risk is low. In addition, consider regularly refreshing this quality central finance environment from your production target system, as this will optimise your regression testing.
- Most SAP projects shift the responsibility for financial reconciliations during data migration to the business. These reconciliations can take a long time and put pressure on your project and client to work overtime and weekends. Consider creating an automated ABAP or SQL program that performs the reconciliations automatically, pulling data via RFC from source to target.
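The core of such a reconciliation is a comparison of aggregated balances between source and target. The sketch below shows that comparison logic in Python under stated assumptions: in a real implementation the two extracts would arrive via RFC (the article suggests ABAP or SQL; an external script could use SAP's PyRFC library), whereas here they are plain in-memory records with illustrative field names.

```python
# Sketch: automated source-vs-target reconciliation by company code and period.
# The line items would come via RFC from the source ECC system and the CFIN
# target; here they are hard-coded dicts with illustrative field names.
from collections import defaultdict

def summarise(line_items):
    """Aggregate amounts by (company_code, period)."""
    totals = defaultdict(float)
    for item in line_items:
        totals[(item["company_code"], item["period"])] += item["amount"]
    return totals

def reconcile(source_items, target_items, tolerance=0.01):
    """Return keys where source and target totals differ beyond tolerance."""
    src, tgt = summarise(source_items), summarise(target_items)
    diffs = {}
    for key in src.keys() | tgt.keys():
        delta = src.get(key, 0.0) - tgt.get(key, 0.0)
        if abs(delta) > tolerance:
            diffs[key] = round(delta, 2)
    return diffs

source = [{"company_code": "0001", "period": "2024-01", "amount": 100.0},
          {"company_code": "0001", "period": "2024-01", "amount": 50.0}]
target = [{"company_code": "0001", "period": "2024-01", "amount": 100.0}]
print(reconcile(source, target))  # {('0001', '2024-01'): 50.0}
```

Scheduling a job like this after each replication cycle turns a weekend of manual spreadsheet reconciliation into an exception report the team only acts on when it is non-empty.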
- Many organisations do not place much priority on sorting out their universe of poor data prior to or during the implementation. Most enterprise implementations and SI partners do not wait for the business to clean and align data beforehand, as these projects are simply too expensive. There is, however, an opportunity to build golden records, remove old data, de-duplicate master data and clean critical fields beforehand. Set up a CFIN data cleansing team, use machine learning and data cleansing automation, and include data cleansing KPIs in your entry criteria. This will also help simplify your universal data model (UDM).
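As a minimal sketch of the de-duplication step: group master data records whose key fields match after normalisation. The field names and vendor records below are invented for illustration, and real matching would use fuzzier rules or the machine learning the article mentions; exact-match-after-normalisation is just the simplest starting point.

```python
# Sketch: simple de-duplication of master data by normalised key fields
# (field names and records are illustrative; real matching is fuzzier).
from collections import defaultdict

def normalise(value):
    """Crude normalisation: lower-case and keep only letters and digits."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

def find_duplicates(records, key_fields=("name", "city")):
    """Group record IDs that share the same normalised key."""
    groups = defaultdict(list)
    for rec in records:
        key = tuple(normalise(rec[f]) for f in key_fields)
        groups[key].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

vendors = [
    {"id": "100001", "name": "Acme Ltd.", "city": "London"},
    {"id": "100002", "name": "ACME LTD", "city": "london"},
    {"id": "100003", "name": "Globex", "city": "Berlin"},
]
print(find_duplicates(vendors))  # [['100001', '100002']]
```

Each duplicate group is then a candidate for a single golden record, which is what keeps the UDM small.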
- Once the data is loaded and replicated from the source systems to the target system, the business will continue using the source systems, so it is critical that transactional and master data keep replicating to CFIN accurately at all times. This requires continuous diligence from a replication point of view: any change to standard or custom mapping affects replication and increases the likelihood of failure. Establish a joint technical and business support and governance team to look after not only the data replication but also any changes that require an update to your UDM and your standard and custom mapping.
About the author: Jaco Boshoff is a GlueData Europe and UK director and has over 25 years' SAP project experience in various techno-functional consulting roles.
GlueData Services
GlueData specialises in SAP master data and enables companies to master their data through a range of services including data migration, data integration, SAP Master Data Governance (MDG), data archiving and data insight.