Companies often hold large amounts of data that should be transferred to the CMDB. This may be in the form of Excel tables that have been maintained continuously over time. Various applications and services also contain important information, for example wiki systems (knowledge bases) or inventory tools. These usually offer a function for exporting their data. If this data is available as CSV files, the import into i-doit can be automated very quickly.
If there is no API or other export option, you must access the database directly. The data is often stored in a database such as PostgreSQL, MySQL, MariaDB or MongoDB. From these databases you can compile the information you need, for example with an SQL query tailored to your requirements, and extract it regularly (e.g. via a cron job).
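As a sketch of such an extraction, a nightly cron job could query the database and write the result as a semicolon-separated CSV file. The host name, credentials, database, table and column names below are purely illustrative assumptions, not part of any real setup:

```shell
# /etc/cron.d/inventory-export -- all names and credentials here are examples.
# Query the inventory database every night at 02:00 and write the result as
# semicolon-separated CSV; `mysql --batch` emits tab-separated output, which
# `tr` converts to the ';' delimiter used by the import later on.
0 2 * * * root mysql --batch --raw -h inventory-db -u readonly -pSECRET inventory -e "SELECT hostname, vendor, model, serial FROM devices" | tr '\t' ';' > /var/www/html/imports/1/clients.csv
```

Note that a cron entry must stay on a single line; the first column of the query output becomes the CSV header row.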
What should you consider in advance? First you must identify suitable data sources.
Anything that can be extracted automatically from a system can also be imported regularly and automatically via the i-doit CLI (command-line interface). In this way you achieve automated updating of the data and at the same time reduce the documentation effort for your employees.
The first step is to identify your data sources. Here, it is useful to ask the following questions. Which data do you want to import and where is it currently stored? Is this data regularly updated by a member of staff or by another department? Should this data be exported regularly from a database or application?
Check the possibilities and, if necessary, create a test run for your query. One problem that can occur here is that you cannot extract the information in the desired form. For example, you may only be able to extract the host name but not the FQDN. In such cases, we recommend adapting the export to the required form with a script.
Here is an example:
We export the host name, manufacturer, model and serial number of our clients from an inventory tool. However, we would like to import not only the host name, but later also the complete FQDN.
For this case we create a small PowerShell script which reads and extends the existing CSV file. To have it run automatically, include this script in the Task Scheduler, either locally or on your Windows server.
# Read the CSV, append the domain to each hostname, and write the extended file.
$Data = Import-Csv -Path "C:\data\Clients.csv" -Delimiter ';' |
    ForEach-Object { $_.Hostname = $_.Hostname + '.i-doit.com'; $_ } |
    ConvertTo-Csv -NoTypeInformation -Delimiter ';'
Out-File -FilePath "C:\data\Clients-erweitert.csv" -InputObject $Data
Now that your CSV file is available in the desired format, start a test run and create the CSV profile required for the import. To do this, upload the extended CSV file under Tools -> Import -> CSV Import and configure the mapping.
Make sure to save the configured mapping as a profile, as it is needed for the automated import.
Once the import has run, the objects are created or updated. Then switch to the console. To configure an automatic CSV import, you first need to know the ID of your CSV profile. To find it, enter the following command in the console:
sudo -u www-data php /var/www/html/console.php csv-profiles --user admin --password admin --tenantId 1
In this example, our profile has the ID 9. Depending on how many different CSV files are going to be imported, it can be helpful to organise the jobs into separate cron files. For this purpose, create a new cron file (e.g. with sudo nano csv-clients in /etc/cron.d/) and add a new cron job. It is essential to specify the correct import profile and the path to your CSV file.
* * * * * www-data /usr/bin/php /var/www/html/console.php import-csv --user admin --password admin --tenantId 1 --importFile /var/www/html/imports/1/clients-erweitert.csv --importProfileId 9 --csvSeparator ";" --multiValueMode column
Finally, all you need to do is copy the necessary files regularly to the path created above, e.g. via cURL. Automated CSV import is thus a fast and cost-effective alternative, as no additional interface is required.
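As an illustration of that last step, a second cron entry could fetch the prepared file over HTTP into the import path before the import job picks it up. The export URL below is a made-up placeholder; adjust the schedule, path and URL to your environment:

```shell
# /etc/cron.d/csv-fetch -- the export URL is a hypothetical example.
# Fetch the prepared CSV into the import path; -f fails on HTTP errors
# so a broken download does not overwrite the previous file silently.
50 1 * * * www-data curl -fsS -o /var/www/html/imports/1/clients-erweitert.csv https://inventory.example.com/exports/clients-erweitert.csv
```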