Under Consideration

initial data load

There is a very common need to load initial data into master and/or parameter entities, e.g. an Excel sheet containing a list of clients or SKUs.

Direct load into custom entity tables via SQL isn't possible because of primary keys and probably other internal logic.

For parameter entities, replication is a decent workaround: we can load the data into a temporary DB table, set up a mapping, and run replication once.

For master entities, there are two options: either enter the data manually or write custom code to load it. Both are unacceptable: we spend resources either on mechanical work or on developing program code that will be thrown away after one-time use.

Please provide a flexible utility able to load data into a custom entity from, e.g., a CSV file. There is no need to handle foreign keys: we can always create a file with the right foreign ids. The primary surrogate key values should be assigned in accordance with Bizagi's internal rules.

Comments (7)

Comment (score: 0)

I've been wondering about this issue as well. My thought was to create the tables with data in a database, then virtualize them in BizAgi. Once virtualized, BizAgi adds its own primary key, so it seems the "internal" stuff is taken care of. Then remove the virtualized source. After that, run some SQL to modify the data (for example, to enter the correct foreign keys that tie to the newly created BizAgi primary keys).

Comment (score: 0)

It may be simpler to define the entity as a Parameter entity, perform replication first, and then convert it to Master.

Comment (score: 1)

Bizagi has a way to load data into a master table when the data is encoded as XML. All the foreign keys must be resolved beforehand, which is time consuming.

I've looked into using the automatic testing tool for this, but mapping to the JSON .test file is very time consuming (similar to a mail merge style).

Right now we are looking into making a staging database that matches the project database, putting the data in there using various scripts, and then importing into the Bizagi database using SQL commands (INSERT INTO dbo.master_table (field list) SELECT field list FROM staging.table).
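The staging-to-project copy described above can be sketched with an in-memory SQLite stand-in for the two SQL Server databases. All table and column names here are hypothetical placeholders, not the actual Bizagi schema:

```python
import sqlite3

# In-memory demo; in practice this would be two SQL Server databases
# (the staging database and the Bizagi project database).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE staging_client (name TEXT, city TEXT);
    CREATE TABLE master_client  (idClient INTEGER PRIMARY KEY,
                                 name TEXT, city TEXT);
    INSERT INTO staging_client VALUES ('Acme', 'Lisbon'), ('Globex', 'Porto');
""")

# The INSERT INTO ... SELECT pattern from the comment above: copy every
# staged row into the master table in a single statement, letting the
# target database assign the surrogate keys.
con.execute("""
    INSERT INTO master_client (name, city)
    SELECT name, city FROM staging_client
""")
rows = con.execute(
    "SELECT idClient, name FROM master_client ORDER BY idClient"
).fetchall()
print(rows)  # [(1, 'Acme'), (2, 'Globex')]
```

Note that this only covers the raw copy; as the comment goes on to say, none of the business rules coded in the forms run when data arrives this way.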

However, none of the business rules coded in the forms get run. We need a way to make sure the loaded data meets all the rules; so far we have to navigate through each form (a custom process using reusable forms).

The user running the forms needs to know how to handle the validation errors. It would have been better to use the automatic testing tool to log which primary keys work and which don't.

However, the .test file (JSON format) stores data values and collection values, and complains if the data doesn't match what the search brought up.

Comment (score: 1)

Dear Users,

A good way to load an entity, Parameter or Master, is using the SOA method saveEntity. For further information: http://help.bizagi.com/bpm-suite/en/index.html?saveentityasstring.htm

This Web method is used to update information in an Entity in Bizagi's data model (from an external application).

Updating information in an Entity involves: inserting new records, modifying existing ones, or disabling (logical deletion) existing records.
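To illustrate the envelope such a call expects, here is a minimal Python sketch that builds a saveEntityAsString-style payload; the entity name "Client" and its attribute are hypothetical placeholders, and a real call would still wrap this string in a SOAP request to the Bizagi web service:

```python
import xml.etree.ElementTree as ET

# Hypothetical entity "Client" with one attribute. The
# <BizAgiWSParam>/<Entities> envelope is the shape shown in the
# saveEntityAsString documentation linked above.
entities = ET.Element("Entities")
client = ET.SubElement(entities, "Client")
ET.SubElement(client, "Name").text = "Acme Corp"

root = ET.Element("BizAgiWSParam")
root.append(entities)

payload = ET.tostring(root, encoding="unicode")
print(payload)
```

Generating the payload programmatically avoids hand-editing XML for every record.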

Regards

Comment (score: 1)

Consider: you have data elsewhere that Bizagi can read from (i.e. through a view).

If all your "add" or "add/edit" forms are reusable, you can make a new process that daisy-chains these forms, one form per activity, in the right order (parent, then children, and so on).

In this new process, every process form does an "auto-next", simply by activating Form Buttons and using any hidden attribute from the process table on the form: the Action is, when this hidden field is Empty Or Not Empty, Click on Next.

Now, in the On Enter of each of these forms, set your entity object in the process table to what you want. When the form runs, it will display your reusable form with the appropriate data for that entity and auto-click the Next button.

However, should that reusable form complain about any business rule or validation rule not being met, the process will pause, allowing a user to fix the data and click Next, and the daisy-chain continues.

Or attach an Error Event to the process form, log the bad entity, go back to the beginning of your loop, select another entity, and so on.

We are basically inserting data directly into master entities using ETL, but setting some attributes so we know which rows are "to do" and which were done, like dETLLoadedWhen and dValidatedWhen.

Then a GetEntity list for "dETLLoadedWhen is not null AND dValidatedWhen is null".

For every process form where the Next does work, On Exit we set dValidatedWhen to Today with an expression.

Now our customer has an easy way to know what is valid after the ETL load (all business rules met). Note that each form will be in "Edit" mode, so no Change Event will be triggered.

Another variation would be to be in "Add" mode and use an Action (when attribute is empty, run script expression) to get data from another source and populate the form attributes. The auto-next will save the screen, or processing will stop on a validation error. Your customer simply runs this somewhere with some supervision.

You just have to be creative about "how" you obtain your data for adding or editing. We are using T-SQL views, and the view "talks" to our Bizagi project database and the "other" database (a staging database).

You need a common "business key" between each pair of entities (source and destination) so that the T-SQL view has something to compare against.

Then, in Bizagi, your GetEntityList grabs the first one, and so on until there are none left.

If your "source" is a flat file or Excel, I suggest loading that data into your SQL Server as a staging database and reading from that into your Bizagi project. This allows you to correct for missing foreign keys on parameter tables, for example, and to clean out bad data.

Cheers from Canada.

Comment (score: 1)

The Bizagi PK starts at 1 and works its way up; it seems to act as a global counter across all tables, so you don't get PK 1 in two different tables.

What if we load, for example MSSQL-to-MSSQL, in a very high range? Say, start at 10 million, then have our scripts do SELECT MAX(idEntityName)+1 FROM EntityName.

So the Bizagi PK never catches up to the PKs we created "behind the scenes".
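A minimal sketch of this high-range idea, using SQLite in place of MSSQL; the entity and column names (EntityName, idEntityName) are the placeholders from the comment above, not a real schema:

```python
import sqlite3

# Reserve a high range for our "behind the scenes" rows so Bizagi's own
# low-range counter never catches up to them.
HIGH_RANGE_START = 10_000_000

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE EntityName (idEntityName INTEGER PRIMARY KEY, name TEXT)"
)
# A row Bizagi itself created (low range):
con.execute("INSERT INTO EntityName VALUES (1, 'made in Bizagi')")

def next_high_id(con):
    # SELECT MAX(id)+1, but never below the reserved high range.
    (max_id,) = con.execute(
        "SELECT MAX(idEntityName) FROM EntityName"
    ).fetchone()
    return max(HIGH_RANGE_START, (max_id or 0) + 1)

for name in ("etl row 1", "etl row 2"):
    con.execute("INSERT INTO EntityName VALUES (?, ?)",
                (next_high_id(con), name))

ids = [r[0] for r in
       con.execute("SELECT idEntityName FROM EntityName ORDER BY 1")]
print(ids)  # [1, 10000000, 10000001]
```

Whether Bizagi's internal sequence tolerates foreign rows in the same table is an open question in this thread; the sketch only shows the ID arithmetic.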

Comment (score: 1)

I found a solution that, while not trivial, is better than coding in C#:

a) Use an Excel spreadsheet to hold the data and generate an XML string formatted specifically to call saveentityasstring via the Bizagi SOA API;

b) Use a simple macro to take the formatted string for each record, generate XML files, and save them in a folder. Something like this:

RECORD1.XML

<tem:saveEntityAsString>
  <tem:entityInfo><![CDATA[<BizAgiWSParam><Entities>
    <Contrato>
      <CNPJ>111</CNPJ>
      <NomeFornecedor>P4Pro</NomeFornecedor>
      <DataFinalVigencia>2020-12-31</DataFinalVigencia>
      <Status>2</Status>
    </Contrato>
  </Entities></BizAgiWSParam>]]>
  </tem:entityInfo>
</tem:saveEntityAsString>

RECORD2.XML

<tem:saveEntityAsString>
  <tem:entityInfo><![CDATA[<BizAgiWSParam><Entities>
    <Contrato>
      <CNPJ>222</CNPJ>
      <NomeFornecedor>P4Pro</NomeFornecedor>
      <DataFinalVigencia>2020-12-31</DataFinalVigencia>
      <Status>2</Status>
    </Contrato>
  </Entities></BizAgiWSParam>]]>
  </tem:entityInfo>
</tem:saveEntityAsString>

And so on, one file for each record we want to load.
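The same per-record file generation can be sketched in Python instead of a spreadsheet macro. The Contrato attributes come from the example records above; the CSV columns are assumed to match them, and the output directory is just a temp folder for the demo:

```python
import csv
import io
import pathlib
import tempfile

# One RECORDn.XML per data row, mirroring the macro described above.
TEMPLATE = """<tem:saveEntityAsString>
<tem:entityInfo><![CDATA[<BizAgiWSParam><Entities>
<Contrato>
<CNPJ>{CNPJ}</CNPJ>
<NomeFornecedor>{NomeFornecedor}</NomeFornecedor>
<DataFinalVigencia>{DataFinalVigencia}</DataFinalVigencia>
<Status>{Status}</Status>
</Contrato>
</Entities></BizAgiWSParam>]]>
</tem:entityInfo>
</tem:saveEntityAsString>"""

# Stand-in for the spreadsheet export; in practice this would be a
# real CSV file saved from Excel.
data = io.StringIO(
    "CNPJ,NomeFornecedor,DataFinalVigencia,Status\n"
    "111,P4Pro,2020-12-31,2\n"
    "222,P4Pro,2020-12-31,2\n"
)

out_dir = pathlib.Path(tempfile.mkdtemp())
for i, row in enumerate(csv.DictReader(data), start=1):
    (out_dir / f"RECORD{i}.XML").write_text(TEMPLATE.format(**row))

print(sorted(p.name for p in out_dir.iterdir()))
# ['RECORD1.XML', 'RECORD2.XML']
```

The generated files can then be fed to SoapUI exactly as in step c) below.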

c) Use SoapUI, a utility to access web services, and create a test script to call the Bizagi SOA using the created files. It is relatively easy following this article: https://www.codeproject.com/Articles/820414/Automating-SoapUI-using-Groovy-A-Walk-Through.