MongoDB and Java 8
Agenda
- Java8 Main Features
- MongoDB + Java8: Few Examples
- RX Driver
Java 8
The MongoDB Java Driver is Java 6+ compliant
Java 8 Features and Improvements
- Lambda Expressions
- New Date API
- Stream API
- Type Annotations
- Compact Profiles
- Security Enhancements
- JavaFX Improvements
- Nashorn JS Engine
- Unicode Enhancements
- Selector Provider (IO & NIO)
- Concurrency
- …
http://www.oracle.com/technetwork/java/javase/8-whats-new-2157071.html
Lambda Function
- Anonymous Functions
  – Not bound to an identifier
  – Defined inline
  – Passed as arguments to higher-order functions
- Functional Java FTW
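A minimal, self-contained sketch of those three properties (the names and data here are illustrative, not from the talk):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class LambdaDemo {
    public static void main(String[] args) {
        // The lambda s -> s.toUpperCase() is an anonymous function:
        // it has no identifier, is defined inline, and is passed as an
        // argument to the higher-order method map().
        List<String> upper = Arrays.asList("ada", "grace", "alan").stream()
                .map(s -> s.toUpperCase())
                .collect(Collectors.toList());
        System.out.println(upper); // [ADA, GRACE, ALAN]
    }
}
```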
Lambda Function
… .map(line -> Arrays.asList(line.split(SEPARATOR))) …
… i -> document.put(headers.get(i), values.get(i)) …
Stream API
- Streams are a free-flowing sequence of elements
  – Pipeline kind of processing
- Starts with a source of data (Collections)
  – Processes elements of that data source in a parallelized fashion
  – Intermediary processing steps, generally implemented by lambdas
  – Terminator operators
- Streams are immutable!
try (BufferedReader reader = new BufferedReader(fr)) {
    return reader.lines()
        .skip(1)
        .map(line -> {
            Document document = new Document();
            List<String> values = Arrays.asList(line.split(SEPARATOR));
            IntStream.range(0, Math.min(values.size(), headers.size()))
                .forEach(i -> document.put(headers.get(i), values.get(i)));
            return document;
        }).collect(Collectors.toList());
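The same pipeline shape can be exercised without the driver: a stdlib-only sketch with a plain Map standing in for the driver's Document, an assumed comma SEPARATOR, and inline sample lines instead of a file:

```java
import java.util.*;
import java.util.stream.*;

public class CsvToMaps {
    private static final String SEPARATOR = ","; // assumed separator

    public static void main(String[] args) {
        List<String> headers = Arrays.asList("name", "city");
        // Stream pipeline: source -> intermediate ops -> terminal op
        List<Map<String, String>> docs =
            Stream.of("name,city", "Ana,Lisbon", "Bob,Porto")
                .skip(1)                                  // drop the header line
                .map(line -> {
                    Map<String, String> document = new LinkedHashMap<>();
                    List<String> values = Arrays.asList(line.split(SEPARATOR));
                    IntStream.range(0, Math.min(values.size(), headers.size()))
                             .forEach(i -> document.put(headers.get(i), values.get(i)));
                    return document;
                })
                .collect(Collectors.toList());            // terminal operator
        System.out.println(docs);
    }
}
```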
Stream API
public List<Document> readRecords(List<String> headers) {
    try (FileReader fr = new FileReader(this.source);
         BufferedReader reader = new BufferedReader(fr)) {
        return reader.lines()
            .skip(1)
            .map(line -> {
                Document document = new Document();
                List<String> values = Arrays.asList(line.split(SEPARATOR));
                IntStream.range(0, Math.min(values.size(), headers.size()))
                    .forEach(i -> document.put(headers.get(i), values.get(i)));
                return document;
            }).collect(Collectors.toList());
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}
New Date API
- There is no native support for new Date API
- Our Codec API makes it possible to map this data type (Instant.class): encode it to a driver-supported data type and decode it back to the original data type
InstantCodec instantCodec = new InstantCodec();
Map<BsonType, Class<?>> replacements = new HashMap<BsonType, Class<?>>();
replacements.put(BsonType.DATE_TIME, Instant.class);
New Date API
public class InstantCodec implements Codec<Instant> {

    public void encode(BsonWriter writer, Instant value,
                       EncoderContext encoderContext) {
        // will store Instant as epoch milliseconds
        writer.writeDateTime(value.toEpochMilli());
    }

    public Class<Instant> getEncoderClass() {
        return Instant.class;
    }

    // returns back an Instant
    public Instant decode(BsonReader reader, DecoderContext decoderContext) {
        return Instant.ofEpochMilli(reader.readDateTime());
    }
}

http://mongodb.github.io/mongo-java-driver/3.0/bson/codecs/
New Date API
InstantCodec instantCodec = new InstantCodec();
Map<BsonType, Class<?>> replacements = new HashMap<BsonType, Class<?>>();
replacements.put(BsonType.DATE_TIME, Instant.class);

CodecRegistry cr = CodecRegistries.fromRegistries(
    CodecRegistries.fromCodecs(instantCodec),
    CodecRegistries.fromProviders(documentCodecProvider),
    MongoClient.getDefaultCodecRegistry());

// add the new codec registry as an option
MongoClientOptions option = MongoClientOptions.builder()
    .codecRegistry(cr).build();

mc = new MongoClient("localhost:27017", option);
collection = mc.getDatabase("dates").getCollection("sample");

Document doc = new Document("java8date", Instant.now());
collection.insertOne(doc);
Extra: RX Driver
ReactiveX
Observable Observer
MongoDB Reactive Streams Driver
- http://mongodb.github.io/mongo-java-driver-rx/
- Observer based rather than callback based! (Cold Observables)
- Built upon the MongoDB Async Driver & RxJava
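To make the Observer-vs-callback point concrete, here is a tiny self-contained sketch of the cold-observable idea. This is not the driver's real API: Observable, just, and subscribe here are hand-rolled stand-ins, showing only that nothing is emitted until an Observer subscribes.

```java
import java.util.function.Consumer;

public class ColdObservableSketch {
    // Minimal stand-in for an observable: a single abstract subscribe method
    interface Observable<T> {
        void subscribe(Consumer<T> onNext, Runnable onComplete);
    }

    // "Cold": the items are only emitted when subscribe() is called
    static Observable<String> just(String... items) {
        return (onNext, onComplete) -> {
            for (String item : items) onNext.accept(item); // emit on demand
            onComplete.run();
        };
    }

    public static void main(String[] args) {
        Observable<String> docs = just("doc1", "doc2"); // no work happens yet
        docs.subscribe(
            d -> System.out.println("next: " + d),
            () -> System.out.println("complete"));
    }
}
```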
Join the Java Courses
https://university.mongodb.com/courses/M101J
Join the Team
Engineering · Sales & Account Management · Finance & People Operations · Pre-Sales Engineering · Marketing
View all jobs and apply: http://grnh.se/pj10su
Thank you!
Norberto Leite Technical Evangelist norberto@mongodb.com @nleite
https://github.com/nleite/java8demo
A SIMPLE GUIDE TO USING AKKA PERSISTENCE
Joost Heijkoop
WHAT WILL THIS BE ABOUT?
Using Akka with Scala
Event Sourcing
How to add Akka Persistence
How to do Serialization with Stamina
WHO AM I?
Joost Heijkoop
I like to create/build/learn/share/teach stuff
I'm a Software Developer, Full Stack Developer, Back-end Developer, Consultant, ...
I work at Xebia
Amsterdam.scala meetup (SUG) organiser
Serial meetup / conference attendee
BUILD STUFF!
AKKA + SCALA
Akka is an Actor Framework, similar to Erlang
Akka is message driven, "extreme" OO
Scala is a hybrid programming language on the JVM
AKKA + SCALA
class Parrot extends Actor {
  var count = 0

  override def receive: Receive = {
    case message: String => {
      count = count + 1
      println(s"$count: $message")
    }
  }
}

val parrot = system.actorOf(Parrot.props)
parrot ! "Hello"
parrot ! "Hello"

1: Hello
2: Hello
WHAT IS EVENT SOURCING
All truths spring from recording events
The current state is the sum of all events
Command -> Event -> Update
Recording of deltas instead of current state
You can replay all of history
CQRS (Command Query Responsibility Segregation) - Martin Fowler
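The "state is the sum of all events" idea is language-agnostic; the talk's code is Scala, but a toy fold can be sketched in plain Java (the Incremented event type and the string commands are illustrative stand-ins):

```java
import java.util.ArrayList;
import java.util.List;

public class EventSourcingSketch {
    // Hypothetical event type, mirroring the talk's Incremented event
    static final class Incremented {
        final int newCount;
        final String message;
        Incremented(int newCount, String message) {
            this.newCount = newCount;
            this.message = message;
        }
    }

    public static void main(String[] args) {
        List<Incremented> journal = new ArrayList<>();
        int count = 0;

        // Command -> Event -> Update: record the delta, then apply it
        for (String command : new String[] {"Hello", "Hello"}) {
            Incremented event = new Incremented(count + 1, command);
            journal.add(event);       // persist the event, not the state
            count = event.newCount;   // update internal state from the event
        }

        // Replay: rebuild the current state from the recorded events alone
        int replayed = 0;
        for (Incremented e : journal) {
            replayed = e.newCount;
        }
        System.out.println("live=" + count + " replayed=" + replayed);
    }
}
```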
EVENT SOURCING WITH AKKA PERSISTENCE
- Get a command
- Create an event
- Persist/record the event to the Event Store
- Update the internal state
EVENT SOURCING WITH AKKA PERSISTENCE
final def persist[A](event: A)(handler: (A) => Unit): Unit
override def receiveCommand: Receive = {
  case message: String =>
    persist(Incremented(count + 1, message))(update)
    println(s"$count: $message")
}

def update: Receive = {
  case Incremented(newCount, message) => count = newCount
}
EVENT SOURCING WITH AKKA PERSISTENCE
class Parrot extends PersistentActor {
  var count = 0

  override def persistenceId = "parrot"

  override def receiveRecover: Receive = {
    case event: Incremented => update(event)
  }

  override def receiveCommand: Receive = {
    case message: String =>
      persist(Incremented(count + 1, message))(update)
      println(s"$count: $message")
  }

  def update: Receive = {
    case Incremented(newCount, message) => count = newCount
  }
}
AKKA + SCALA
class Parrot extends Actor {
  var count = 0

  override def receive: Receive = {
    case message: String => {
      count = count + 1
      println(s"$count: $message")
    }
  }
}
PERSISTED STATE
BEFORE
1: Hello 2: Hello [restart app] 1: Hello
WITH AKKA PERSISTENCE
1: Hello 2: Hello [restart app] 3: Hello
STANDARD SERIALIZATION IS A PAIN
Java serialization is standard:

Incremented(count: Int)
  -> IncrementedV2(count: Int, message: String)

What you want:

Incremented(count: Int)
  -> Incremented(count: Int, message: String)
STAMINA
It serializes to a non-binary format
The default serialization is JSON
Allows you to migrate before deserialization
There is a catch, for now
ADD STAMINA
SRC/MAIN/RESOURCES/APPLICATION.CONF
akka.actor {
  serializers {
    goto = "goto.PricingAkkaSerializer"
  }
  serialization-bindings {
    "stamina.Persistable" = goto
  }
}
ADD STAMINA
import stamina._
import stamina.json._
import stamina.json.SprayJsonMacros._

case class Incremented(count: Int) extends Persistable

val parrotPersister = persister[Incremented]("increment")

class PricingAkkaSerializer(persisters: Persisters)
  extends StaminaAkkaSerializer(persisters) {
  def this() {
    this(Persisters(List(Parrot.parrotPersister)))
  }
}
STAMINA MIGRATION
case class Incremented(count: Int, message: String) extends Persistable

persist(Incremented(count + 1, message))(update)

def update: Receive = {
  case Incremented(newCount, message) => count = newCount
}
STAMINA MIGRATION
import spray.json.lenses.JsonLenses._

val parrotPersister = persister[Incremented, V2](
  "increment",
  from[V1].to[V2](
    _.update('message ! set[String]("[empty message]")))
)
STAMINA
PROJECT/BUILD.SCALA
import sbt.{Build, Project, ProjectRef, uri}
object GotoBuild extends Build {
  lazy val root = Project("root", sbt.file("."))
    .dependsOn(staminaCore, staminaJson)

  lazy val staminaCore = ProjectRef(
    uri("git://github.com/scalapenos/stamina.git#master"), "stamina-core")

  lazy val staminaJson = ProjectRef(
    uri("git://github.com/scalapenos/stamina.git#master"), "stamina-json")
}
QUESTIONS
Using Akka with Scala? Event Sourcing? How to add Akka Persistence? How to do Serialization with Stamina? Wat?!?
RESOURCES
Akka Persistence Stamina example Akka Persistence documentation Stamina - Akka Persistence serialization CQRS - Martin Fowler Event Sourcing - Martin Fowler Amsterdam.scala meetup group Xebia
Design for Quality in Java 8
Emil Forslund Speedment, Inc.
About me
- Emil Forslund
- Code Monkey @ Speedment, Inc.
About the project
Speedment is a Java tool that
- generates a domain model from a database,
- organizes that model in a graph-like manner, and
- lets you write super-fast in-memory database transactions.
About the project
Speedment has been in development for about 5 years and consists of roughly 1800 components. When Java 8 was released, the entire code base was rewritten to make use of all the new language features.
Some words about inheritance
- Created as a means of reducing code repetition almost 50 years ago
- Makes it possible to organize code in a well established structure
And what tends to happen?
- As a system grows, dependency trees tend to grow as well
- Often leads to coupled systems with bad maintainability
- Testability worsens with complex inheritance models
What is a trait?
- A reusable component that can be combined with others to create classes
- A description of a specific behavior or property
Trait definition?
- A set of methods that implement behavior
- Stateless
- Can be combined using the following operators:
  – Symmetric sum
  – Override (asymmetric sum)
  – Alias
  – Exclusion
An example in Scala
trait HasName {
  def nameProperty() : Property[String]

  def setName(name : String) {
    nameProperty().set(name)
  }

  def getName() : Option[String] =
    nameProperty().get()
}
Traits in Java?
Was not possible... until recently with the release of Java 8
Default methods
Using interface default methods, a concept similar to Traits can be achieved. Interfaces can
- be combined
- be overridden
- have a default implementation
- be referenced using the "&" selector
What is the difference
Interfaces don't cover all the features of traits from other languages. You still can't:
- prioritize different traits
- do other operations than union
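The lack of prioritization shows up as a compile error: when two interfaces contribute conflicting default methods, Java has no operator to prefer or exclude one, so the class must resolve the diamond by hand. A minimal sketch (interface names A, B, C are illustrative):

```java
public class DiamondDemo {
    interface A { default String hello() { return "A"; } }
    interface B { default String hello() { return "B"; } }

    // Implementing both A and B does not compile without an explicit
    // override: Java offers no "prioritize" or "exclude" trait operator,
    // so the conflict is resolved manually via X.super.
    static class C implements A, B {
        @Override
        public String hello() { return A.super.hello(); }
    }

    public static void main(String[] args) {
        System.out.println(new C().hello()); // prints A
    }
}
```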
So how does it look in Java?
interface HasName {
    Property<String> nameProperty();

    default void setName(String name) {
        nameProperty().setValue(name);
    }

    default Optional<String> getName() {
        return Optional.ofNullable(nameProperty().getValue());
    }
}
Write decoupled systems
class Person implements HasName {...}
class Pet implements HasName {...}

class Friend {
    public void show(HasName a, HasName b) {
        System.out.println(a.getName().orElse("no one")
            + " is a friend of "
            + b.getName().orElse("no one") + ".");
    }
}
Combine multiple traits
<T extends HasName & HasFields> String printFields(T obj) {
    return obj.getName() + " has the following fields: "
        + obj.getFields().collect(joining("\n"));
}
Comparison
- More components to keep track of
- More components to write tests for
- Lower average file length
- Easier to reuse code when scaling out functionality
- Tests are more specific and might reveal more bugs
Real world example
- Refactoring Speedment took about 6 months
- Maintainability measurements improved by 70%
- Testability measurements improved by 40%
- Complexity was reduced by 90%
Want to know more?
Interwebs: www.speedment.org Github: www.github.org/speedment Twitter: @Speedment @emifors