IntelliJ, Groovy, Jenkins, and unresolved access

I am building a shared library that is used to run pipelines on Jenkins. I frequently use things like:
def call(String stage_name = "Generate installer") {
    //noinspection GrUnresolvedAccess
    stage(stage_name) {
        ...
    }
}
stage is a Jenkins step. It generates an "unresolved access" warning in IntelliJ because it is not declared anywhere. Fortunately it is only a warning, not an error as it would be for a missing class import.
However, I wonder whether there is a better solution than sprinkling suppress-warning declarations everywhere. Is there any way to tell IntelliJ that the Jenkins steps exist?
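One way to avoid the suppression comments, if your Jenkins version exposes it, is the GDSL file that the Pipeline Syntax page can generate (typically at JENKINS_URL/pipeline-syntax/gdsl): drop it into a source root of the IntelliJ project and the IDE learns the available steps. Alternatively, here is a minimal hand-written GDSL sketch; the step signatures below are illustrative guesses, not the complete step set:
// pipeline.gdsl, placed in a source root of the IntelliJ project.
// GDSL is IntelliJ's GroovyDSL mechanism for describing dynamic members.
def pipelineCtx = context(scope: scriptScope())

contributor(pipelineCtx) {
    // Declare the 'stage' step so IntelliJ no longer flags it as unresolved.
    method name: 'stage', type: 'Object',
           params: [name: 'java.lang.String', body: 'groovy.lang.Closure'],
           doc: 'Jenkins Pipeline stage step'
    // Further steps (echo, node, sh, ...) can be declared the same way.
    method name: 'echo', type: 'Object', params: [message: 'java.lang.String']
}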


Gradle: Add dependency from Java to Native compilation

I am trying to set up Gradle for a proper JNI build: I first need to build a shared library (with the c plugin) and then compile and test the Java code that consumes it.
Here is a sample of the build.gradle, related to the native compilation:
model {
    components {
        yli(NativeLibrarySpec) {
            sources {
                c {
                    source {
                        srcDir 'src/main/c'
                        include "Yli.c"
                        commonFolders.each {
                            include "$it/**/*.c"
                        }
                    }
                }
            }
            buildTypes {
                release
            }
        }
    }
}
What is the best way to tell Gradle that compileJava should wait for the build of the NativeLibrarySpec?
Edit: When I try to add
compileJava.dependsOn(yliSharedLibrary)
I get the following error during the Gradle build:
* What went wrong:
A problem occurred evaluating root project 'yli'.
> Could not get unknown property 'sharedLibrary' for root project 'yli' of type org.gradle.api.Project.
Note: I used the command 'gradle tasks' in order to find the name of the task: 'yliSharedLibrary'.
I played around with this and discovered that you can access the tasks created by the software model from within closures. For example, if you want to depend on one of the native tasks, you can do so with:
compileJava.dependsOn { yliNativeCompileTask }
Of course, if you want the Java task to come after the native one, but not force an actual dependency between them, you can use mustRunAfter():
compileJava.mustRunAfter { yliNativeCompileTask }
This syntax also works for declared inputs and outputs:
compileJava.inputs.files { yliNativeCompileTask }
Note that if you tie the inputs of a task to the outputs of another task, you don't have to explicitly declare a dependsOn. Gradle infers the task dependency.
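As a minimal, self-contained sketch of that inference rule (the producer/consumer tasks are invented for illustration, not the native-model tasks above):
// build.gradle, illustrative only: 'producer' writes a file, 'consumer' reads it.
task producer {
    def outFile = file("$buildDir/producer/out.txt")
    outputs.file outFile
    doLast {
        outFile.parentFile.mkdirs()
        outFile.text = 'built by producer'
    }
}

task consumer {
    // Wiring this task's inputs to producer's outputs is enough:
    inputs.files producer.outputs.files
    doLast { println inputs.files.singleFile.text }
}
// `gradle consumer` now runs producer first, with no explicit dependsOn.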
Disclaimer: I don't know whether this is the correct way to do this, or how far you can take this approach.
One final thing: the old native software model is being replaced by a new set of native plugins based on Gradle's current model. It should be much easier to integrate Java projects with these new plugins, but you may want to wait until they have been fully fleshed out before attempting a migration.

Automatically resolving dependencies for code analysis

I have a simple piece of Scala code using the Spoon library:
class ExtractCodeDataTest extends FlatSpec {
    it should "Run and not be empty" in {
        val l = new Launcher()
        l.addInputResource("./testData/owasp-security-logging")
        l.buildModel()
        val factory = l.getFactory
        val allClass = factory.Class().getAll(true)
        println(allClass)
    }
}
I cloned an open-source project from GitHub, but I can't compile it:
The import org.junit cannot be resolved at /home/user/IdeaProjects/testSearch/testData/owasp-security-logging/owasp-security-logging-logback/src/test/java/org/owasp/security/logging/filter/SecurityMarkerFilterTest.java:3
spoon.compiler.ModelBuildingException: The import org.junit cannot be resolved at /home/user/IdeaProjects/testSearch/testData/owasp-security-logging/owasp-security-logging-logback/src/test/java/org/owasp/security/logging/filter/SecurityMarkerFilterTest.java:3
at spoon.support.compiler.jdt.JDTBasedSpoonCompiler.report(JDTBasedSpoonCompiler.java:583)
at spoon.support.compiler.jdt.JDTBasedSpoonCompiler.reportProblems(JDTBasedSpoonCompiler.java:564)
at spoon.support.compiler.jdt.JDTBasedSpoonCompiler.build(JDTBasedSpoonCompiler.java:120)
at spoon.support.compiler.jdt.JDTBasedSpoonCompiler.build(JDTBasedSpoonCompiler.java:101)
I can't find any way to resolve the dependencies automatically. How can I fix this for projects in general, not only this particular one?
Spoon can be used in two different modes:
with full classpath
in "no-classpath mode"
In the first mode, you have to provide Spoon with the whole classpath necessary to analyze the project, for example by using the --source-classpath command-line argument, or by calling launcher.getEnvironment().setSourceClasspath(String[]). In this mode you have the maximum amount of information to analyze your code.
In the second mode, Spoon will only analyze the given source code, without exploiting information from libraries. You won't be able to get all the information about classes from external libraries, or to compile the code, but you can still analyze the project's source code. You can use this mode by calling launcher.getEnvironment().setNoClasspath(true).
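For illustration, here is a minimal sketch of both modes through Spoon's Launcher API; the classpath entry is a placeholder you would replace with the project's real dependency jars:
import spoon.Launcher

// Full-classpath mode: hand Spoon the dependency jars explicitly.
def full = new Launcher()
full.addInputResource('./testData/owasp-security-logging')
full.environment.setSourceClasspath(['/path/to/junit.jar'] as String[]) // placeholder jar path
full.buildModel()

// No-classpath mode: analyze the sources without resolving external libraries.
def lenient = new Launcher()
lenient.addInputResource('./testData/owasp-security-logging')
lenient.environment.setNoClasspath(true)
lenient.buildModel()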
Please note that there is an open issue on Spoon about automatically analyzing a Maven project, taking into account all the dependencies declared in the pom.xml; see: https://github.com/INRIA/spoon/issues/1396.

How to use methods from a global external java library in a Groovy Jenkins Pipeline?

First, I'm new to Java, Groovy, and Jenkins, so please be patient with me ;)
I'm preparing a Jenkins server with Pipeline support for future use in our build environment.
We use a special in-house scripting language for which I have to write a wrapper in Java. There is no option to do the work in Groovy alone; we have to use this special language.
I have tried many ways of referencing the Java lib from this Jenkins project, but none of them worked.
Mainly I've followed the documentation at https://github.com/jenkinsci/workflow-cps-global-lib-plugin to implement this, but I have also tried several approaches found via Google or Stack Overflow. According to the documentation, this include should be possible.
I've reduced the process to a test setup for testing purposes.
Assume the following...
I have a multibranch project in Jenkins named 'MultibranchTestProject01'.
The Jenkinsfile:
@Library('DeltaJenkinsScripts@develop') _
def runStageCollect = true
if (runStageCollect)
{
    stage("Collect")
    {
        helloWorld("Joe")
    }
}
The library is configured globally via 'Global Pipeline Libraries' in the Jenkins settings, but it is also referenced explicitly here to clarify things.
It's hosted in a Git environment, and the referencing seems to work.
The file structure of this library:
/vars/helloWorld.groovy
package de.dcomp.prod
def call(name) {
    def tt = new Test()
    tt.testText()
}
/src/de/dcomp/prod/Test.groovy
package de.dcomp.prod
import de.dcomp.ftel.*
def testText()
{
    def sRetVal = ""
    echo "testText - START"
    //sRetVal = ScriptRunner.GetStaticSampleText()
    def oSR = new ScriptRunner()
    sRetVal = oSR.GetInstanceSampleText()
    echo "ReturnValue: ${sRetVal}"
}
I have a Java lib called ScriptRunner-0.0.1-SNAPSHOT.jar. This library has a single class:
package de.dcomp.ftel;
public class ScriptRunner
{
    public String GetInstanceSampleText()
    {
        return "ScriptRunner.GetInstanceSampleText() called...";
    }

    public static String GetStaticSampleText()
    {
        return "ScriptRunner.GetStaticSampleText() called...";
    }
}
I have no problem referencing and using this library in a standalone Java project.
I've tried several ways to include it:
Putting the jar file in 'C:\Users\cr\.groovy\lib'
Setting the classpath in a Linux test environment.
Using the plugin "Pipeline: Classpath Steps" to add the library to the classpath in different notations, e.g. 'C:\Users\cr\.groovy\lib', 'C:/Users/cr/.groovy/lib', 'C:\Users\cr\.groovy\lib\ScriptRunner-0.0.1-SNAPSHOT.jar', 'C:/Users/cr/.groovy/lib/ScriptRunner-0.0.1-SNAPSHOT.jar', 'file:///C:/Users/cr/.groovy/lib/ScriptRunner-0.0.1-SNAPSHOT.jar'
Adding the lib to a local Maven repository and referencing it per @GrabResolver and @Grab, though this is not the solution I would like to have
or dynamic loading with:
this.class.classLoader.rootLoader.addURL(new URL("file:///C:/Users/cr/.groovy/lib/ScriptRunner-0.0.1-SNAPSHOT.jar"));
def srClass = Class.forName("de.dcomp.ftel.ScriptRunner")
def sr = srClass.newInstance()
The result is always something like this:
groovy.lang.MissingPropertyException: No such property: ScriptRunner for class: de.dcomp.prod.Test
or this:
de/dcomp/prod/Test.groovy: 10: unable to resolve class ScriptRunner
@ line 10, column 12.
def oSR = new ScriptRunner()
The error messages always point in the direction that the process cannot find the Java library. The same thing happens if I try to use some other library, e.g. from Apache Commons.
I would like to avoid writing it as a plugin if possible.
Thanks in advance!
The only method I've found so far that works is to run this in the pipeline to find out which directories are being checked:
println System.getProperty("java.ext.dirs")
And in my case, it was looking in
/usr/java/packages/lib/ext
So I put the jar I wanted to load in that location (after having to create the directory), and then restarted Jenkins.
Afterwards I was able to import the library and use it successfully.
This seems very hacky, and the sort of thing that might be considered a bug and removed without notice.
If you are using an external library (@Library) in your pipeline, you can define Grape dependencies via @Grab. The example below is from the ciinabox-pipelines shared library. This will download the jars and load them automatically in the Groovy script.
@Grab(group='com.amazonaws', module='aws-java-sdk-ec2', version='1.11.198')
import com.amazonaws.services.ec2.*
import com.amazonaws.services.ec2.model.*
import com.amazonaws.regions.*
What is important is that the code above probably won't work in the pipeline itself, but when loaded as part of a shared library it should, with the latest plugin versions.
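To make the placement concrete, a hypothetical shared-library step using this pattern could look like the following; the file name, region handling, and SDK calls are illustrative assumptions, not taken from ciinabox-pipelines:
// vars/describeInstances.groovy in the shared library (hypothetical example).
@Grab(group='com.amazonaws', module='aws-java-sdk-ec2', version='1.11.198')
import com.amazonaws.services.ec2.AmazonEC2ClientBuilder
import com.amazonaws.services.ec2.model.DescribeInstancesRequest

def call(String region) {
    // Grape has already fetched the SDK jars when the library was loaded.
    def ec2 = AmazonEC2ClientBuilder.standard().withRegion(region).build()
    def result = ec2.describeInstances(new DescribeInstancesRequest())
    echo "Found ${result.reservations.size()} reservations"
}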

Eclipse fails where javac and IDEA succeed

Consider the following self-contained sample:
package bloopers;

import java.lang.annotation.Annotation;

public final class Blooper5
{
    interface Converter<T,F>
    {
        T convert( F from );
    }

    interface Identifier<T>
    {
    }

    static class ConvertingIdentifier<F,T> implements Identifier<F>
    {
        ConvertingIdentifier( Converter<T,F> converter )
        {
        }
    }

    static final class AnnotationIdentifier
    {
        Identifier<Annotation> I1 = new ConvertingIdentifier<>(
                a -> a.annotationType() );
        Identifier<Annotation> I2 = new ConvertingIdentifier<>(
                Annotation::annotationType ); //<-- ERROR
        Identifier<Annotation> I3 = new ConvertingIdentifier<>(
                (Converter<Class<? extends Annotation>,Annotation>)
                        Annotation::annotationType );
    }
}
The code above compiles just fine under the following:
javac from the command line.
IntelliJ IDEA configured to use the javac compiler.
But it fails to compile with the following:
Eclipse
IntelliJ IDEA configured to use the Eclipse compiler.
Eclipse fails to compile the line marked with <-- ERROR, giving the following message:
The constructor Blooper5.ConvertingIdentifier<Annotation,Class<capture#5-of ? extends Annotation>>(Blooper5.Converter<Class<? extends Annotation>,Annotation>) is undefined
Admittedly, this code really pushes the generic parameter type inference capabilities of the compiler, but still, I would like to know precisely what the discrepancy is, no matter how small.
Some details of my setup, in case someone spots something wrong that I have failed to see:
The command I used to compile with javac was "c:\Program Files\Java\jdk1.8.0_40\bin\javac" Blooper5.java.
I have version 14.1 of IntelliJ IDEA. Under Project Structure/SDKs I only have "1.8" which points to C:\Program Files\Java\jdk1.8.0_40 and under Project Structure/Modules the specific module is configured to use the "Project SDK (1.8)" which lists as 1.8 (java version "1.8.0_40").
As for Eclipse, I am using Eclipse for RCP and RAP Developers - Version: Luna Release (4.4.0) - Build id: 20140612-0600. Under Preferences/Java/Installed JREs I only have jdk1.8.0_40, and it is the default. Under Execution Environments it is also checked as a "Compatible JRE" of "JavaSE-1.8". And in my Project/Properties/Java Build Path/Libraries the "JRE System Library" is [jdk1.8.0_40].
More noteworthy facts:
It is not just me; it also fails on a colleague's (very similar) Eclipse installation.
IntelliJ IDEA says that the lambda expression a -> a.annotationType() can be replaced with a method reference, but if asked to do so, it does not convert it to Annotation::annotationType; instead, it converts it to (Converter<Class<? extends Annotation>, Annotation>) Annotation::annotationType.
So, the question:
What is causing these discrepancies between Eclipse and the others, and what can be done to eliminate these discrepancies?
(Obviously, the goal is to eliminate the unfortunately too frequently occurring scenario where one developer commits code which fails to compile on another developer's IDE.)
EDIT: When I originally posted this question I thought that IDEA using the Eclipse compiler also compiled fine, but I was wrong. It turns out that it is possible to get IDEA to fail to compile the above code by selecting the Eclipse compiler. Still, the question is why there is a discrepancy between Eclipse and javac.
The answer to "why is there a discrepancy" is straightforward but perhaps not very satisfactory: because compilers have bugs and are furthermore open to interpretation of a very complex language specification. Determining whether it's a bug in javac or Eclipse is a difficult task; I've seen such discrepancies end up being declared both ways, sometimes as Eclipse compiler bugs, sometimes as javac bugs. That determination, especially when it involves generics and new language features (such as lambdas), can get quite tedious and arcane. For example, look at this one that turned out to be a javac bug but did uncover a related issue in Eclipse's compiler: https://bugs.eclipse.org/bugs/show_bug.cgi?id=456459
The best bet is to report it as an Eclipse bug as I did and see if the Eclipse compiler team can/will track it down.

IntelliJ erroneously indicating an error for Akka code in Java

Here's the setup:
IntelliJ 13.1.2 (also tried 13.1.1)
Akka 2.3.2
Java 1.8 (also tried Java 1.7)
Scala library 2.11 (also tried 2.10)
Scala IntelliJ plugin 0.35.683
Gradle 1.11 (IntelliJ is using this same installation)
I'm attempting to write some Java code that does some Akka stuff -- create Actors, send messages around, etc. However, there are two pieces of code that throw type errors in IntelliJ but work just fine when compiling on the command line, and I'm at a loss as to how to resolve it.
Future#onSuccess()
I have a call like this:
Future<Iterable<Object>> sequence = Futures.sequence(...);
sequence.onSuccess(new PrintResult<Iterable<Object>>(), getContext().dispatcher());
where PrintResult is defined the same as here: http://doc.akka.io/docs/akka/2.3.2/java/futures.html (i.e. PrintResult extends OnSuccess)
However, I get a persistent red underline in IntelliJ on PrintResult:
onSuccess(scala.PartialFunction<java.lang.Iterable<java.lang.Object>,java.lang.Object>, ExecutionContext) in Future cannot be applied
to (PrintResult<java.lang.Iterable<java.lang.Object>>, ExecutionContextExecutor)
LoggingAdapter#log()
I also have a method relating to logging that can be simplified down to this:
public void log(LoggingAdapter adapter, Logging.LogLevel level, String message) {
    adapter.log(level, message);
}
IntelliJ doesn't complain, but if I try to build on the command line, javac complains: error: incompatible types: LogLevel cannot be converted to int. Changing the call to adapter.log(level.asInt(), message) makes the command-line error go away, but then IntelliJ complains that adapter.log(...) doesn't take an int for the level.
Any ideas what could fix the issues and make IntelliJ stop complaining?
This looks like an IntelliJ Scala plugin bug, plain and simple. There are many bugs in the official bug tracker that match this general description, all "good code red" or similar. So I'll start there and perhaps open a bug if needed.
