Source code repositories have branches; binary ones don't. So any part of your development plan that relies on branching can't rely on binary repositories for things that are in the scope of branches.
(One exception to the above is ClearCase, which does do both, and is best not spoken of.)
If the teams are to be independent, you really do need to manage the dependencies in terms of released versions. This will be painful; what you are doing is hard. You will probably have to get used to planning and discussing changes in advance, abandoning ad-hoc refactoring.
On the other hand, perhaps the teams are really only nominal, e.g. everyone is working together to a single overall product release/delivery schedule. Then just keep to a single overall version number that gets bumped any time a feature branch is merged back to master. All local dependencies then simply use ${project.version}.
Note that this is still not a snapshot, as you never update it without changing the number.
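Such a local dependency would then look like this in a module's POM (the group and artifact coordinates here are invented for illustration):

```xml
<dependency>
    <groupId>com.example.product</groupId>
    <artifactId>some-sibling-module</artifactId>
    <!-- always track the single overall product version -->
    <version>${project.version}</version>
</dependency>
```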
And then within a team, share code, not binaries. This means some extra recompilation, but if that takes more than 15 seconds, get faster PCs. And anyone not using a multi-module IDE like Eclipse will have to write a trivial reactor-build POM to cover the set of modules they are currently working on.
For how to set up an IDE, a CI build server and a reactor build for this kind of situation, see:

http://books.sonatype.com/m2eclipse-book/reference/eclipse.html
http://axelfontaine.com/blog/final-nail.html
http://ahoehma.wordpress.com/2010/12/22/intermodule-dependencies-now-better-working-with-maven-3/
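The "trivial reactor-build POM" mentioned above can be sketched as a plain aggregator listing the modules currently being worked on (the module names are placeholders):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>workspace-reactor</artifactId>
    <version>1.0-SNAPSHOT</version>
    <!-- pom packaging: this module only aggregates, it builds nothing itself -->
    <packaging>pom</packaging>
    <modules>
        <module>module-a</module>
        <module>module-b</module>
    </modules>
</project>
```

Running `mvn install` from this POM's folder builds the listed modules together in dependency order.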
Or, you could try reimplementing ClearCase with some scripting on top of git+Nexus (don't do this).
Alright, so I found a way which works for me, and it's automatable.
I found this snippet which allows you to create a Gradle task that will copy certain project dependencies to an arbitrary folder in Maven directory structure. I needed to tweak it slightly (link broken) so it doesn't just do that for one single configuration, but for any and all configurations the project defines.
My build script looks like this now:
```groovy
apply plugin: 'java'

repositories {
    maven {
        url './dependencies-maven/'
    }
    mavenCentral()
    jcenter()
    // whatever other external repos your deps come from
}

dependencies {
    // ...
}

task offlineRepo(type: OfflineMavenRepository) {
    repoDir = new File(project.projectDir, 'dependencies-maven')
}
```
You then simply fire the offlineRepo task once: all current project dependencies will be downloaded to Gradle's local cache and then copied into a subfolder of the project directory (in Maven repo structure). The next time you run the build with the project directory in this state, the just-populated local repo is queried first, and since it contains (or should contain) all the dependencies, nothing needs to be taken from Gradle's user-folder cache or from the web.
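The "Maven repo structure" the task reproduces is just a fixed mapping from a dependency's coordinates to a directory path. A minimal sketch of that mapping (my own illustration, not part of the snippet):

```python
def maven_repo_path(group, artifact, version, filename):
    """Map Maven coordinates to the repo-relative path of an artifact file:
    dots in the group id become directory separators."""
    return "/".join([group.replace(".", "/"), artifact, version, filename])

# e.g. org.slf4j:slf4j-api:1.7.36
path = maven_repo_path("org.slf4j", "slf4j-api", "1.7.36", "slf4j-api-1.7.36.jar")
# → "org/slf4j/slf4j-api/1.7.36/slf4j-api-1.7.36.jar"
```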
At this point you can take the whole project folder to any other machine that has just Gradle installed, and it should be able to do the build without ever going online to fetch additional data.
Whenever you want to update a project dependency or add new ones, just fire the offlineRepo task again. Be aware that this won't delete the old version of an updated dependency, so if you don't want to end up with a cluttered repo, you will need to do some manual cleaning here.
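That manual cleaning can be assisted by a script. A hedged sketch (my own, not part of the original snippet) that lists modules carrying more than one version in a Maven-layout folder, assuming a directory whose children are all plain files is a version directory:

```python
import os

def stale_versions(repo_dir):
    """Return {module path: sorted versions} for modules with more than one
    version directory, assuming Maven layout .../<group>/<artifact>/<version>/."""
    found = {}
    for dirpath, dirnames, filenames in os.walk(repo_dir):
        versions = [
            d for d in dirnames
            if os.listdir(os.path.join(dirpath, d))  # non-empty dir...
            and all(os.path.isfile(os.path.join(dirpath, d, f))
                    for f in os.listdir(os.path.join(dirpath, d)))  # ...of files only
        ]
        if len(versions) > 1:
            found[os.path.relpath(dirpath, repo_dir)] = sorted(versions)
    return found
```

Review the output and delete the version directories you no longer need.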
With this, plus putting the Gradle wrapper offline as well (described in my original question), I've now made my project buildable completely offline and with minimal prerequisites: no internet connectivity required, not even a Gradle install required. Merely the JDK.
Note: When I set this up in a completely empty dummy project, I had to create a Java source directory and a sample Java file, otherwise Gradle would only download/copy some POMs but not the corresponding JARs.
Addition: The tweaked snippet will only take care of project dependencies. But if you use a custom plugin, which is a buildscript dependency, the script will miss it. You'll need to add more code to the build() method:
```groovy
for (Configuration configuration : project.buildscript.configurations.findAll()) {
    copyJars(configuration)
    copyPoms(configuration)
}
```
Edit, December 2019: Three years later, after a good number of upvotes over that time and now an explicit request, it appears this topic is still important to more people than just me.
To avoid link breakage, I will post the current version of my project skeleton setup right here.
The following assumes you are starting with a blank slate / empty folder that is supposed to be your future project directory. All paths I specify here are relative to the project root folder.
- Get a gradle-wrapper.jar and place it in /gradle/wrapper/. Place the accompanying gradlew and gradlew.bat run scripts in the project's root folder. Either grab those files directly if you have them around somewhere, or install Gradle on your local system and run gradle wrapper in your project's root folder to generate them. (You do not need to install Gradle locally if you can get the wrapper jar and scripts from somewhere else, and even if not, this is a one-time job: once the wrapper is in place, there is no more need for a local Gradle installation.)
- Grab a Gradle binary distribution package (file name like gradle-X.X-bin.zip) from somewhere and place it in /gradle/wrapper/.
- Create (or modify an existing) /gradle/wrapper/gradle-wrapper.properties to contain the following:

```
distributionBase=PROJECT
distributionPath=gradle/wrapper/dists
zipStoreBase=PROJECT
zipStorePath=gradle/wrapper/dists
distributionUrl=./gradle-X.X-bin.zip
```
(The file name specified in distributionUrl obviously is just a placeholder and has to match the actual file name of the binary distribution package you used in step 2.) (You can alternatively set both distributionBase and zipStoreBase to GRADLE_USER_HOME to avoid having those temporary files created in your project's folder; they will be created in your home folder instead.)
- Make sure you exclude certain temporary files and directories from version control. I myself am using Git, so my example ignore file is for Git; adjust as needed if you are using a different version control system:

```
.gradle/
gradle/wrapper/dists/
build/
buildSrc/.gradle
buildSrc/build/
```
- Add the file /buildSrc/src/main/groovy/buildutils/OfflineMavenRepository.groovy with the following contents (this is bmuschko's code linked above with all my proposed modifications applied to it):
```groovy
package buildutils

import org.gradle.api.tasks.Input
import org.gradle.api.tasks.Optional
import org.gradle.api.tasks.OutputDirectory
import org.gradle.api.tasks.TaskAction
import org.gradle.api.DefaultTask
import org.gradle.util.GFileUtils
import org.gradle.api.artifacts.Configuration
import org.gradle.api.artifacts.component.ModuleComponentIdentifier
import org.gradle.maven.MavenModule
import org.gradle.maven.MavenPomArtifact

class OfflineMavenRepository extends DefaultTask {

    @OutputDirectory
    File repoDir = new File(project.projectDir, 'dependencies/maven')

    @TaskAction
    void build() {
        // Plugin/Buildscript dependencies
        for (Configuration configuration : project.buildscript.configurations.findAll()) {
            copyJars(configuration)
            copyPoms(configuration)
        }

        // Normal dependencies
        for (Configuration configuration : project.configurations.findAll()) {
            copyJars(configuration)
            copyPoms(configuration)
        }
    }

    private void copyJars(Configuration configuration) {
        configuration.resolvedConfiguration.resolvedArtifacts.each { artifact ->
            def moduleVersionId = artifact.moduleVersion.id
            File moduleDir = new File(repoDir, "${moduleVersionId.group.replace('.', '/')}/${moduleVersionId.name}/${moduleVersionId.version}")
            GFileUtils.mkdirs(moduleDir)
            GFileUtils.copyFile(artifact.file, new File(moduleDir, artifact.file.name))
        }
    }

    private void copyPoms(Configuration configuration) {
        def componentIds = configuration.incoming.resolutionResult.allDependencies.collect { it.selected.id }
        def result = project.dependencies.createArtifactResolutionQuery()
            .forComponents(componentIds)
            .withArtifacts(MavenModule, MavenPomArtifact)
            .execute()
        for (component in result.resolvedComponents) {
            def componentId = component.id
            if (componentId instanceof ModuleComponentIdentifier) {
                File moduleDir = new File(repoDir, "${componentId.group.replace('.', '/')}/${componentId.module}/${componentId.version}")
                GFileUtils.mkdirs(moduleDir)
                File pomFile = component.getArtifacts(MavenPomArtifact)[0].file
                GFileUtils.copyFile(pomFile, new File(moduleDir, pomFile.name))
            }
        }
    }
}
```
- Create your /build.gradle skeleton as follows:
```groovy
repositories {
    maven {
        url './dependencies/'
    }
    mavenLocal()
    mavenCentral()
    jcenter()
}

buildscript {
    repositories {
        maven {
            url './dependencies/'
        }
        mavenLocal()
        mavenCentral()
        jcenter()
    }
}

import buildutils.OfflineMavenRepository

task offlineDependencies(type: OfflineMavenRepository) {
    repoDir = new File(project.projectDir, 'dependencies/')
}
```
- Add your additional online dependency repos, dependencies and other build logic to your build.gradle, then run ./gradlew offlineDependencies. This will create a folder structure within /dependencies/ and download and place all the dependency JARs inside. You only have to do this once to get the files there; once they are in place, you will only need to run this command again if your project's dependencies change (i.e. an additional dependency or a version change).
- Check everything into version control and you should be good. The only requirement for building the project is a working JDK on your system. Gradle is contained within your project's directory (usable via the wrapper), and so are all your Maven dependencies.
Best Answer
If I understand your situation correctly, you have many A-type Maven artifacts, and these mostly all define dependencies on a few B-type artifacts.
The A-type artifacts are the product of your development group. The B-type artifacts that A depends on come from an external group. Your runtime environments will already have the B artifacts bundled into them.
The problem is that when the version of a B artifact changes on the server, you do not want to be hassled with manually updating many POM files to pick up all of the dependency changes that happened in B.
Your first suggestion I think is the best approach:
You can absolutely do this, and it is an accepted practice. What you do is define a parent POM file that declares all of the dependencies an A-type artifact could need. This parent POM can be deployed to your Maven repository as its own unique group, artifact and versioned module. It can then be referenced in your A-type projects' POM files using the <parent> element. The benefit is that you can update all of the B-type plugin versions in the parent project and re-version just the parent POM. Your A-type POMs then only need to update their parent version number when they are to pick up the new dependencies. For more information on parent POM projects, see https://stackoverflow.com/questions/14400642/maven-parent-pom
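A hedged sketch of that setup (all group/artifact names and versions are invented for illustration): the parent POM pins the B versions in dependencyManagement, and each A project inherits it and omits the versions:

```xml
<!-- parent POM, deployed to the repository as its own artifact -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example.a</groupId>
    <artifactId>a-parent</artifactId>
    <version>7</version>
    <packaging>pom</packaging>
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.vendor.b</groupId>
                <artifactId>b-core</artifactId>
                <version>2.4.1</version>
            </dependency>
        </dependencies>
    </dependencyManagement>
</project>

<!-- an A-type project: references the parent, no B version anywhere -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>com.example.a</groupId>
        <artifactId>a-parent</artifactId>
        <version>7</version>
    </parent>
    <artifactId>a-plugin-one</artifactId>
    <dependencies>
        <dependency>
            <groupId>com.vendor.b</groupId>
            <artifactId>b-core</artifactId>
            <!-- version is inherited from the parent's dependencyManagement -->
        </dependency>
    </dependencies>
</project>
```

Bumping b-core then means editing one line in a-parent and releasing a new parent version.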
But you see this isn't actually a bad thing!
Using the provided scope on a dependency on a B plugin basically tells Maven that it is ONLY going to use this artifact for compiling sources and unit testing. Maven will specifically ignore the dependency when packaging the build artifact, on the assumption that the JDK or the application container will provide it at runtime.
http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html
If some of the dependencies aren't actually used, it won't matter: they won't be bundled; they are only downloaded and used to compile and to run test cases.
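Declaring such a dependency with the provided scope looks like this (the coordinates are invented for illustration):

```xml
<dependency>
    <groupId>com.vendor.b</groupId>
    <artifactId>b-core</artifactId>
    <version>2.4.1</version>
    <!-- compiled and tested against, but not bundled: the runtime provides it -->
    <scope>provided</scope>
</dependency>
```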