
v0.3.0

@lsulak released this 25 Oct 14:22

Breaking Changes 💥

  • Additional data methods of AtumContext now use REST API v2 (Agent 0.3.0+ is incompatible with server 0.2.0) by @lsulak in #283
  • Full Flyway integration developed by @benedeki in #276

New Features 🎉

  • Atum server REST API v2 developed by @salamonpavel, @TebaleloS, @lsulak, @benedeki in #140

    • GET /partitionings?partitioning=serializedPartitioning in #268
    • GET /partitionings/{partId}/additional-data in #227
    • PATCH /partitionings/{partId}/additional-data in #221
    • there are many more, but they are "silent live" for now (see the Silent Live section below); a minimal call sketch is shown at the end of this list
  • Introduced response envelopes providing additional metadata (requestId) for REST API v2 endpoints by @salamonpavel in #197

  • Replaced Json4s and Jackson serialization libraries with Circe by @TebaleloS, @salamonpavel, @benedeki in #214

  • Introduced a health API endpoint in the form that StatusBoard projects expect by @salamonpavel in #282

  • Dockerfile and application configuration verified for deployment with ZIO and the Http4s web server by @salamonpavel in #274

  • Dockerfile adjusted to the ZIO framework; custom configuration is now passed during docker run, i.e. independently of the sbt build and docker build, by @lsulak in #279

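Below is a minimal sketch of calling one of the new v2 endpoints from Scala using only the JDK HTTP client; the base URL `http://localhost:8080` and the partitioning id `123` are placeholders for illustration, not values from this release.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object AdditionalDataExample {
  def main(args: Array[String]): Unit = {
    val client = HttpClient.newHttpClient()

    // GET /partitionings/{partId}/additional-data -- server URL and id are hypothetical
    val request = HttpRequest.newBuilder()
      .uri(URI.create("http://localhost:8080/partitionings/123/additional-data"))
      .GET()
      .build()

    val response = client.send(request, HttpResponse.BodyHandlers.ofString())

    // v2 responses are wrapped in an envelope that carries metadata such as requestId
    // alongside the payload itself, so expect the additional data nested inside it.
    println(response.body())
  }
}
```
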
Silent Live 🤫

  • Introduced the Reader module to make reading information stored in the Atum server easy, by @benedeki in #248 (not published yet, only in the code base)
  • Atum server REST API v2 endpoints developed by @salamonpavel, @TebaleloS, @lsulak, @benedeki in #140
    • There are numerous other endpoints implemented besides those mentioned above. We discourage their use for now, though, as they are subject to change, particularly their payloads.

Known Issues ⚠️

  • Dependency shading might be needed when using the Agent in a Spark environment (especially when some Hadoop dependencies are in use as well, for example if you package an application that contains the Agent and submit the resulting JAR via spark-submit). Here's the suggested project code snippet for Maven (an sbt-assembly sketch follows the snippet):
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${maven.shade.plugin.version}</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <relocations>
                        <relocation>
                            <pattern>okhttp3</pattern>
                            <shadedPattern>shaded.okhttp3</shadedPattern>
                        </relocation>
                        <relocation>
                            <pattern>okio</pattern>
                            <shadedPattern>shaded.okio</shadedPattern>
                        </relocation>
                        <relocation>
                            <pattern>sttp</pattern>
                            <shadedPattern>shaded.sttp</shadedPattern>
                        </relocation>
                        <relocation>
                            <pattern>cats</pattern>
                            <shadedPattern>shaded.cats</shadedPattern>
                        </relocation>
                        <relocation>
                            <pattern>shapeless</pattern>
                            <shadedPattern>shaded.shapeless</shadedPattern>
                        </relocation>
                        <relocation>
                            <pattern>kotlin</pattern>
                            <shadedPattern>shaded.kotlin</shadedPattern>
                        </relocation>
                    </relocations>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                </configuration>
            </plugin>
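
If you build the application with sbt instead of Maven, a roughly equivalent setup (a sketch, assuming the sbt-assembly plugin is used) relocates the same packages; sbt-assembly's default merge strategy should already discard the META-INF signature files that the Maven filter above excludes:

```scala
// build.sbt fragment -- a sketch assuming sbt-assembly is on the plugin classpath
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("okhttp3.**"   -> "shaded.okhttp3.@1").inAll,
  ShadeRule.rename("okio.**"      -> "shaded.okio.@1").inAll,
  ShadeRule.rename("sttp.**"      -> "shaded.sttp.@1").inAll,
  ShadeRule.rename("cats.**"      -> "shaded.cats.@1").inAll,
  ShadeRule.rename("shapeless.**" -> "shaded.shapeless.@1").inAll,
  ShadeRule.rename("kotlin.**"    -> "shaded.kotlin.@1").inAll
)
```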

Full Changelog

v0.2.0...v0.3.0