- MariaDB (also supports MySQL; `mariadb-java-client v3.5.7`)
- Microsoft SQL Server (`mssql-jdbc v13.2.1.jre11`)
- Snowflake (`snowflake-jdbc v3.28.0`)
## Custom JDBC Drivers
In addition to the bundled JDBC drivers, you can register custom JDBC drivers.
The following sections describe the required configuration.
### Download Custom JDBC Driver
Download the JDBC driver for each database management system that you want to connect to.
[Integrations](../../../../build/integrations/index.md) provides links for well-known systems and lists those that are actively used with Corporate Memory.
### Provide a Custom JDBC Driver
Consult your solutions manager or DevOps specialist for options to copy or inject the JDBC driver JAR into a Corporate Memory deployment.
Depending on the deployment model, suitable options include:
- The Docker Compose package `cmem-orchestration` mounts the folder `./conf/dataintegration/plugin/` into the DataIntegration container.
  The configuration snippets below assume this location, which maps to `/opt/cmem/eccenca-DataIntegration/dist/etc/dataintegration/conf/plugin/` inside the container.
- A dedicated _Build project_ in which the driver JAR files are uploaded as project file resources.
- Dedicated file or resource mounts in a Docker Compose or Helm/Kubernetes configuration.
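For the Docker Compose option, the mount can be sketched as follows. This is a minimal, illustrative fragment: the service name `dataintegration` is an assumption, while the host and container paths follow the folder mapping described above.

```yaml
services:
  dataintegration:
    volumes:
      # Host folder holding custom JDBC driver JARs, mounted into the
      # DataIntegration plugin folder inside the container.
      - ./conf/dataintegration/plugin/:/opt/cmem/eccenca-DataIntegration/dist/etc/dataintegration/conf/plugin/
```

Any JAR placed in `./conf/dataintegration/plugin/` on the host then becomes visible to DataIntegration after a container restart.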
## Driver Registration
A custom JDBC driver must be registered in the DataIntegration configuration file `dataintegration.conf`, in the `spark.sql.options` section.
The following example shows how to register a custom JDBC driver for Databricks:
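As a rough sketch only, such a registration in `dataintegration.conf` might take the following shape. The key names inside `spark.sql.options` and the JAR file name are assumptions for illustration, not verified configuration syntax; only the Databricks driver class `com.databricks.client.jdbc.Driver` is the documented class name of the Databricks JDBC driver. Consult the Corporate Memory reference documentation for the exact keys.

```hocon
# Hypothetical registration entry in dataintegration.conf (HOCON syntax).
spark.sql.options = {
  jdbc.drivers = [
    {
      # Driver class implemented by the Databricks JDBC driver
      driverClass = "com.databricks.client.jdbc.Driver"
      # JAR provided via the plugin folder described above
      jarFile = "/opt/cmem/eccenca-DataIntegration/dist/etc/dataintegration/conf/plugin/DatabricksJDBC42.jar"
    }
  ]
}
```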