gRPC-Java works with JDK 6. TLS usage typically requires Java 8, or the Play Services Dynamic Security Provider on Android. Please see the Security Readme.
Download the JARs. Or for Maven, add to your `pom.xml`:

```xml
<dependency>
  <groupId>io.grpc</groupId>
  <artifactId>grpc-all</artifactId>
  <version>0.14.0</version>
</dependency>
```
Or for Gradle, add to your dependencies:
```groovy
compile 'io.grpc:grpc-all:0.14.0'
```
For the Android client, depend only on the sub-projects you need, such as:

```groovy
compile 'io.grpc:grpc-okhttp:0.14.0'
compile 'io.grpc:grpc-protobuf-nano:0.14.0'
compile 'io.grpc:grpc-stub:0.14.0'
```
Development snapshots are available in Sonatype's snapshot repository.
For protobuf-based codegen, you can put your proto files in the `src/main/proto` and `src/test/proto` directories along with an appropriate plugin.
For protobuf-based codegen integrated with the Maven build system, you can use protobuf-maven-plugin:
```xml
<build>
  <extensions>
    <extension>
      <groupId>kr.motd.maven</groupId>
      <artifactId>os-maven-plugin</artifactId>
      <version>1.4.1.Final</version>
    </extension>
  </extensions>
  <plugins>
    <plugin>
      <groupId>org.xolstice.maven.plugins</groupId>
      <artifactId>protobuf-maven-plugin</artifactId>
      <version>0.5.0</version>
      <configuration>
        <!--
          The version of protoc must match protobuf-java. If you don't depend on
          protobuf-java directly, you will be transitively depending on the
          protobuf-java version that grpc depends on.
        -->
        <protocArtifact>com.google.protobuf:protoc:3.0.0-beta-2:exe:${os.detected.classifier}</protocArtifact>
        <pluginId>grpc-java</pluginId>
        <pluginArtifact>io.grpc:protoc-gen-grpc-java:0.14.0:exe:${os.detected.classifier}</pluginArtifact>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>compile-custom</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```
For protobuf-based codegen integrated with the Gradle build system, you can use protobuf-gradle-plugin:
```groovy
apply plugin: 'java'
apply plugin: 'com.google.protobuf'

buildscript {
  repositories {
    mavenCentral()
  }
  dependencies {
    // ASSUMES GRADLE 2.12 OR HIGHER. Use plugin version 0.7.5 with earlier
    // gradle versions
    classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.7'
  }
}

protobuf {
  protoc {
    // The version of protoc must match protobuf-java. If you don't depend on
    // protobuf-java directly, you will be transitively depending on the
    // protobuf-java version that grpc depends on.
    artifact = "com.google.protobuf:protoc:3.0.0-beta-2"
  }
  plugins {
    grpc {
      artifact = 'io.grpc:protoc-gen-grpc-java:0.14.0'
    }
  }
  generateProtoTasks {
    all()*.plugins {
      grpc {}
    }
  }
}
```
If you are making changes to gRPC-Java, see the compiling instructions.
Here's a quick readers' guide to the code to help folks get started. At a high level there are three distinct layers to the library: Stub, Channel & Transport.
The Stub layer is what is exposed to most developers and provides type-safe bindings to whatever datamodel/IDL/interface you are adapting. gRPC comes with a plugin to the protocol-buffers compiler that generates Stub interfaces out of `.proto` files, but bindings to other datamodels/IDLs should be trivial to add and are welcome.
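For example, a generated stub is typically used like this. This is a minimal sketch assuming a hello-world style Greeter service generated from a `.proto` file; the `GreeterGrpc`, `HelloRequest`, and `HelloReply` names are illustrative output of the codegen, not part of the gRPC API itself:

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

public class GreeterClient {
  public static void main(String[] args) throws Exception {
    // Channel layer: a ManagedChannel to the server (plaintext for brevity).
    ManagedChannel channel =
        ManagedChannelBuilder.forAddress("localhost", 50051).usePlaintext(true).build();

    // Stub layer: type-safe, generated bindings over the channel.
    GreeterGrpc.GreeterBlockingStub stub = GreeterGrpc.newBlockingStub(channel);
    HelloReply reply = stub.sayHello(HelloRequest.newBuilder().setName("world").build());
    System.out.println(reply.getMessage());

    channel.shutdown();
  }
}
```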
The Channel layer is an abstraction over Transport handling that is suitable for interception/decoration and exposes more behavior to the application than the Stub layer. It is intended to make it easy for application frameworks to use this layer to address cross-cutting concerns such as logging, monitoring, and auth. Flow control is also exposed at this layer so that more sophisticated applications can interact with it directly.
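As an illustration of attaching cross-cutting behavior at this layer, here is a minimal logging interceptor sketch built on `io.grpc.ClientInterceptor`. The logging and the wrapping via `ClientInterceptors.intercept` are just one way to use the interceptor API, not a prescribed pattern:

```java
import io.grpc.CallOptions;
import io.grpc.Channel;
import io.grpc.ClientCall;
import io.grpc.ClientInterceptor;
import io.grpc.ClientInterceptors;
import io.grpc.ForwardingClientCall.SimpleForwardingClientCall;
import io.grpc.MethodDescriptor;

// Illustrative interceptor that logs the full method name of every outgoing call.
public class LoggingInterceptor implements ClientInterceptor {
  @Override
  public <ReqT, RespT> ClientCall<ReqT, RespT> interceptCall(
      MethodDescriptor<ReqT, RespT> method, CallOptions callOptions, Channel next) {
    System.out.println("Calling " + method.getFullMethodName());
    // Delegate to the underlying channel; the call itself is unchanged.
    return new SimpleForwardingClientCall<ReqT, RespT>(next.newCall(method, callOptions)) {};
  }
}

// Usage: wrap an existing channel so all stubs created from it are intercepted.
// Channel intercepted = ClientInterceptors.intercept(channel, new LoggingInterceptor());
```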
The Transport layer does the heavy lifting of putting bytes on and taking bytes off the wire. Its interfaces are abstract just enough to allow plugging in different implementations. Transports are modeled as Stream factories. The variation in interface between a server Stream and a client Stream exists to codify their differing semantics for cancellation and error reporting.
Note that the transport-layer API is considered internal to gRPC and has weaker API guarantees than the core API under the `io.grpc` package.
gRPC comes with three Transport implementations:

- The Netty-based transport is the main transport implementation. It is used on both the client and the server.
- The OkHttp-based transport is a lightweight transport intended mainly for use on Android, on the client side.
- The in-process transport is for when a server is in the same process as the client, which is useful for testing.
Tests showing how these layers are composed to execute calls using protobuf messages can be found in the interop-testing module: https://github.com/google/grpc-java/tree/master/interop-testing/src/main/java/io/grpc/testing/integration