Friday, November 13, 2015

Eclipse Error: archive required for library cannot be read

Recently I encountered a strange bug in Eclipse after a colleague included Selenium in the pom file of a Maven project. Eclipse complained "Error: archive required for library cannot be read". The detailed error looked like this:

"Description Resource Path Location Type Archive for required library: [root]/.m2/repository/org/seleniumhq/selenium/selenium-support/2.46.0/selenium-support-2.46.0.jar' in project '[*]' cannot be read or is not a valid ZIP file umom Build path Build Path Problem"

However, the selenium-support-2.46.0.jar turned out to be a valid jar which could be opened as a zip archive.

After some searching, the following workaround worked in my case (link: https://bugs.eclipse.org/bugs/show_bug.cgi?id=364653#c3):

Workaround: for each affected project, set 'Incomplete build path' to 'Warning' on the Compiler > Building property page. After that, shut down and restart Eclipse, then update the Maven projects. The problem should then be gone.

Thursday, November 12, 2015

Run mvn package with multiple modules containing independent pom files without a public repository

Recently I needed to build a Java application that depends on several different modules, each with its own pom file that packages it into a jar. These jars are not distributed from a public repository such as Maven Central, and I did not use a repository manager such as Nexus in this case. The modules live in their own folders (with names such as "SK-Utils", "SK-Statistics", "SK-DOM", "OP-Core", "OP-Search", "ML-Core", "ML-Tune", "ML-Clustering" and "ML-Trees"). What makes it complicated is that these modules depend on each other; e.g., ML-Core depends on "SK-DOM" and "SK-Utils" to build and run its unit tests. Building these modules from the IntelliJ IDE works fine, but they fail to build from the command line with "mvn package" because the inter-module dependencies cannot be resolved from any repository. Therefore I wrote a bash script and placed it in the directory containing the modules. The script runs "mvn package" against the pom file of each module in turn and then installs the resulting jar into the local repository. The script "mvn-build.sh" looks like the following:

#!/usr/bin/env bash
# Build each module in dependency order and install the resulting jar into the local Maven repository.
dirArray=( "SK-Utils" "SK-Statistics" "SK-DOM" "OP-Core" "OP-Search" "ML-Core" "ML-Tune" "ML-Clustering" "ML-Trees" )
for dirName in "${dirArray[@]}"
do
    echo "$dirName"
    cd "$dirName"

    jarPath="target/$dirName-0.0.1-SNAPSHOT-jar-with-dependencies.jar"

    # Make a jar left over from a previous build writable (-f tests for a regular file, not -d)
    if [ -f "$jarPath" ]; then
        chmod 777 "$jarPath"
    fi

    mvn package

    # Install the assembled jar into the local repository so downstream modules can resolve it
    mvn install:install-file -Dfile="$jarPath" -DgroupId=com.meme -DartifactId="$dirName" -Dpackaging=jar -Dversion=0.0.1-SNAPSHOT

    cd ..
done

Running the above script from its folder, e.g. with "sudo ./mvn-build.sh", should build all the modules of the project.

Note that each module's pom should declare plugins like the ones below so that the "jar-with-dependencies" jar is generated during the package phase.

<build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-install-plugin</artifactId>
                <version>2.5.2</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.6</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>

                <executions>
                    <execution>
                        <id>make-assembly</id> 
                        <phase>package</phase> 
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

One more note: if you are running on an environment such as CentOS Linux and encounter "mvn: command not found" when executing "sudo ./mvn-build.sh", it may be because sudo does not preserve your PATH environment variable. In that case, just run:

> sudo env "PATH=$PATH" ./mvn-build.sh

Saturday, November 7, 2015

Use HttpURLConnection to send a "GET" request to Elasticsearch with a JSON body (emulating the "-d" option in curl's GET)

Recently I was working on a Java equivalent, using HttpURLConnection, of the following curl query, which sends a "GET" request to Elasticsearch with a JSON body specified via curl's "-d" option:

curl -XGET "http://127.0.0.1:9200/messages/_search?pretty" -d '
{
  "size" : 10,
  "query" : {
    "bool" : {
      "must" : [ {
        "match" : {
          "id" : {
            "query" : "[some id]",
            "type" : "boolean"
          }
        }
      },  {
        "nested" : {
          "query" : {
            "bool" : {
              "must" : {
                "match" : {
                  "agent" : {
                    "query" : "[some agent name]",
                    "type" : "boolean"
                  }
                }
              }
            }
          },
          "path" : "agents"
        }
      } ]
    }
  }
}'

The command requires a JSON body to be sent to Elasticsearch via the "GET" REST call. After some trial and error, I got this to work; below is the method implemented in Java. (One thing worth noting: once a request body is written, the JDK's HttpURLConnection typically transmits the request as a POST even though "GET" was set; this still works here because Elasticsearch's _search endpoint accepts both GET and POST.)

 public static String httpGet(String urlToRead, String data) {
        StringBuilder result = new StringBuilder();
        try {
            URL url = new URL(urlToRead);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setDoOutput(true);            // allow a request body to be written
            conn.setRequestMethod("GET");

            // write the JSON body (the JDK may actually send this as a POST once a body is written)
            BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(conn.getOutputStream()));
            writer.write(data);
            writer.flush();
            writer.close();

            // read back the response
            BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = rd.readLine()) != null) {
                result.append(line);
            }
            rd.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return result.toString();
 }
To call the above method and reproduce the curl query against Elasticsearch, use something like the following:

 
int size = 10;
String ipAddress = "127.0.0.1";
String url = "http://"+ipAddress+":9200/messages/_search?pretty";
String data = " { \"size\" : "+size+", \"query\" : { \"bool\" : { \"must\" : [ { \"match\" : { \"id\" : { \"query\" : \"[some id]\", \"type\" : \"boolean\" } } }, { \"nested\" : { \"query\" : { \"bool\" : { \"must\" : { \"match\" : { \"agent\" : { \"query\" : \"[some agent name]\", \"type\" : \"boolean\" } } } } }, \"path\" : \"agents\" } } ] } } } ";
String response = httpGet(url, data);

Tuesday, November 3, 2015

Create and Run Apache Spark Scala project using Eclipse and SBT

This post shows a simple way to create and run an Apache Spark Scala project using Eclipse and SBT. Follow this post (http://czcodezone.blogspot.sg/2015/11/create-scala-project-using-scala-ide.html) to create an SBT-compatible Scala project in Scala IDE, then open the project's build.sbt and add the following line towards the end of the file:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"

Furthermore, since a Spark job is submitted to a Spark cluster as a jar, an sbt plugin should be added to package the project into a single jar. Create a file named "assembly.sbt" in the "project" folder under the root directory with the following content:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")


Now, after you have put in your Spark Scala code, navigate to the project root directory on the command line and run "sbt compile" and "sbt package" (or "sbt assembly" to build the single jar with the plugin above), then either "sbt run" to run it locally or "spark-submit" to submit the jar to a Spark cluster.
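
As an illustration of the kind of Spark Scala code that could go into the project, here is a minimal sketch (the object name is made up; "local[*]" runs Spark locally, and the master is normally supplied by spark-submit when running on a cluster):

import org.apache.spark.{SparkConf, SparkContext}

object HelloSpark {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark locally using all cores; omit setMaster when submitting to a cluster
    val conf = new SparkConf().setAppName("HelloSpark").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // a trivial job: sum the numbers 1 to 100
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println("sum = " + sum)

    sc.stop()
  }
}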

Create a Scala project using Scala-IDE and SBT on CentOS

This post shows how to create and build a Scala project using Scala IDE and SBT (Simple Build Tool).

Step 1: Create  a Scala project using Scala IDE


Download, unzip and launch the Scala IDE (link: http://scala-ide.org/), and use the IDE to create a Scala project (say, with the name "ScalaHelloWorld").

Step 2: Install SBT

Download and install SBT using the following command:

> curl https://bintray.com/sbt/rpm/rpm | sudo tee /etc/yum.repos.d/bintray-sbt-rpm.repo
> sudo yum install sbt

Step 3: Configure Scala project for SBT


After installing SBT, navigate to your "ScalaHelloWorld" root directory and create a build.sbt file there with the following content:

name := "File Searcher"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0" % "test"

Next, create a folder named "project" under the root directory and create a plugins.sbt file in that "project" folder, with the following content:

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")


Now go back to the root directory and in the command line enter the following command:

> sbt eclipse

This will set up the Eclipse Scala project to work with SBT. To begin writing the Scala implementation, a simple recommendation is to create a folder structure like the following under the project's root directory:

src/main/scala
src/main/resources
src/test/scala
src/test/resources

Now, in the Scala IDE, right-click the "src/main/scala" and "src/test/scala" folders, select "Build Path -> Use as Source Folder", and create your .scala files in these folders.
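
For example, a minimal test placed in src/test/scala could look like the following (the class name is just an illustration; it uses the scalatest dependency declared in build.sbt above):

import org.scalatest.FunSuite

class HelloWorldSuite extends FunSuite {

  test("addition works") {
    assert(1 + 1 == 2)
  }
}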

Step 4: Run SBT

After you have written your Scala implementation, navigate to the project root directory and type the following command to run the unit tests:

> sbt test

And the following command to launch the application from its main class:

> sbt run

Wednesday, October 28, 2015

Unit Testing of AngularJS in Spring MVC maven project using Jasmine, PhantomJS, and Jenkins CI

This post is about how to perform unit testing of AngularJS in a Spring MVC Maven project, with some useful links. One typical setup uses Jasmine and PhantomJS.

Some general descriptions of these tools: Jasmine is a behavior-driven development framework for testing JavaScript; PhantomJS is a headless browser that can be invoked on Jenkins CI to run the AngularJS unit tests (Jenkins CI is a continuous integration server that supports building and testing software projects).

POM setup


First, include the following in your Maven POM file:

<properties>
<angularjs.version>1.4.3-1</angularjs.version>
<phantomjs.outputDir>${java.io.tmpdir}/phantomjs</phantomjs.outputDir>
</properties>

<build>
  <pluginManagement>
        <plugins>
            <!--This plugin's configuration is used to store Eclipse m2e settings 
                only. It has no influence on the Maven build itself. -->
            <plugin>
                <groupId>org.eclipse.m2e</groupId>
                <artifactId>lifecycle-mapping</artifactId>
                <version>1.0.0</version>
                <configuration>
                    <lifecycleMappingMetadata>
                        <pluginExecutions>
                            <pluginExecution>
                                <pluginExecutionFilter>
                                <groupId>com.github.klieber</groupId>
              <artifactId>phantomjs-maven-plugin</artifactId>
                                    <versionRange>
                                        [0.7,)
                                    </versionRange>
                                    <goals>
                                        <goal>install</goal>
                                    </goals>
                                </pluginExecutionFilter>
                                <action>
                                    <ignore></ignore>
                                </action>
                            </pluginExecution>
                        </pluginExecutions>
                    </lifecycleMappingMetadata>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>

<plugins>

<plugin>
          <groupId>com.github.klieber</groupId>
          <artifactId>phantomjs-maven-plugin</artifactId>
          <version>0.7</version>
          <executions>
            <execution>
              <goals>
                <goal>install</goal>
              </goals>
            </execution>
          </executions>
          <configuration>
            <version>1.9.7</version>
          </configuration>
        </plugin>
   <plugin>
     <groupId>com.github.searls</groupId>
     <artifactId>jasmine-maven-plugin</artifactId>
     <version>2.0-alpha-01</version>
     <executions>
       <execution>
         <goals>
           <goal>test</goal>
         </goals>
       </execution>
     </executions>
     
  
     
     <configuration>
       <additionalContexts>
         <context>
           <contextRoot>/lib</contextRoot>
           <directory>${project.build.directory}/generated-resources/unit/ml/js</directory>
         </context>
       </additionalContexts>
       <skipTests>true</skipTests>
       <preloadSources>
          <source>/webjars/jquery/2.1.3/jquery.min.js</source>
     <source>/webjars/bootstrap/3.3.5/js/bootstrap.min.js</source>
     <source>/webjars/angularjs/${angularjs.version}/angular.min.js</source>
     <source>/webjars/angularjs/${angularjs.version}/angular-route.min.js</source>
     <source>/webjars/angularjs/${angularjs.version}/angular-animate.min.js</source>
    
      <source>/webjars/angularjs/${angularjs.version}/angular-mocks.js</source>
       </preloadSources>
       <jsSrcDir>${project.basedir}/src/main/resources/js</jsSrcDir>
       <jsTestSrcDir>${project.basedir}/src/test/resources/js</jsTestSrcDir>
       <webDriverClassName>org.openqa.selenium.phantomjs.PhantomJSDriver</webDriverClassName>
      <webDriverCapabilities>
     <capability>
      <name>phantomjs.binary.path</name>
      <value>${phantomjs.binary}</value>
     </capability>
    </webDriverCapabilities>
     </configuration>
   </plugin>
  </plugins>

  
<build>  


In the <plugins> section, two plugins are added: the jasmine-maven-plugin and the phantomjs-maven-plugin. The Jasmine plugin runs the JavaScript unit tests, and the PhantomJS plugin downloads the PhantomJS executable into a temporary folder so that it can be invoked to run those tests on Jenkins. The phantomjs-maven-plugin is very useful because the machine running Jenkins CI may not have PhantomJS pre-installed when the project is fetched for testing. With the plugin's "install" goal, the build first checks whether a copy of PhantomJS is already available in the specified phantomjs.outputDir folder; if not, it downloads one from the internet and puts it there. The plugin then sets the phantomjs.binary property automatically, so the build knows where to find the PhantomJS executable.

The org.eclipse.m2e lifecycle-mapping entry in <pluginManagement> is there to stop Eclipse from complaining that m2e cannot handle the "install" goal of the phantomjs-maven-plugin. It has no effect on Maven itself when it builds and runs the project.

Implement the Jasmine and Spring unit testing code


For this, there is already a nice article on how to do it here at:

https://spring.io/blog/2015/05/19/testing-an-angular-application-angular-js-and-spring-security-part-viii

Therefore I won't repeat it.

Tuesday, October 27, 2015

Building Asynchronous RESTful Services With Spring MVC and Guava

Recently I was working on some asynchronous RESTful services in the Spring MVC framework, and I was thinking of using Guava to reduce the development effort. After failing to find any good reference on using Guava with Spring MVC's asynchronous RESTful operations, I decided to work it out by trial and error. In the end, it turned out to be quite easy. This post shows how to develop asynchronous RESTful services with Spring MVC and Google's Guava library.

First, include the dependency for Guava in your Spring MVC project.
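
For example, the Guava dependency in the pom might look like the following (the version number here is only an assumption; use whichever recent version suits your project):

<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
</dependency>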

Implementation using Guava in Spring MVC


Now write a simple Spring service, something like the one below:


import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListeningExecutorService;
import com.google.common.util.concurrent.MoreExecutors;

import java.util.concurrent.Callable;
import java.util.concurrent.Executors;

import java.util.Date;

import org.springframework.stereotype.Service;

@Service
public class MyLongRunningServiceImpl implements MyLongRunningService {
    private ListeningExecutorService service;

    public MyLongRunningServiceImpl() {
        // wrap a fixed thread pool so that submitted tasks return Guava ListenableFutures
        service = MoreExecutors.listeningDecorator(Executors.newFixedThreadPool(10));
    }

    public ListenableFuture<Date> doLongRunningProcess(final int milliseconds) {
        ListenableFuture<Date> future = service.submit(new Callable<Date>() {
            public Date call() throws Exception {
                Thread.sleep(milliseconds);
                return new Date();
            }
        });

        return future;
    }
}
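
The MyLongRunningService interface that the implementation above refers to is not shown in the original post; a minimal sketch of what it could look like is below (the interface definition itself is an assumption):

import com.google.common.util.concurrent.ListenableFuture;

import java.util.Date;

public interface MyLongRunningService {
    // returns a Guava ListenableFuture that completes after the given delay
    ListenableFuture<Date> doLongRunningProcess(int milliseconds);
}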

As shown in the service implementation above, the service has a method that delays for a user-specified number of milliseconds before returning a Date object. Next, write a simple controller that autowires the service:


import com.google.common.util.concurrent.FutureCallback;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;

import java.util.Date;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.context.request.async.DeferredResult;

@Controller
public class MyLongRunningController {
  // the logger was used but not declared in the original post; slf4j is assumed here
  private static final Logger logger = LoggerFactory.getLogger(MyLongRunningController.class);

  @Autowired
  private MyLongRunningService service;

  @RequestMapping(value = "/longRun", method = RequestMethod.GET)
  public @ResponseBody DeferredResult<Date> doLongRunningProcess() {

    final DeferredResult<Date> deferredResult = new DeferredResult<>();

    logger.info("start long running process");

    ListenableFuture<Date> future = service.doLongRunningProcess(60000);

    Futures.addCallback(future, new FutureCallback<Date>() {

      @Override
      public void onFailure(Throwable throwable) {
        logger.error("long running process failed", throwable);
        // complete the deferred result on failure as well, so the request does not hang until timeout
        deferredResult.setErrorResult(throwable);
      }

      @Override
      public void onSuccess(Date res) {
        logger.info("long running process returns");
        deferredResult.setResult(res);
      }

    });

    return deferredResult;
  }
}

This completes the coding part. Apart from the implementation above, there are two more things needed to make the asynchronous RESTful service work:

<async-supported>true</async-supported>


The first thing is to add the following XML element into your web.xml configuration:

<async-supported>true</async-supported>

This should be put under both the <servlet> section (for the servlet that contains the Spring controller above) and the <filter> section for "org.springframework.web.filter.DelegatingFilterProxy".
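
For illustration, the placement might look like the sketch below (the servlet and filter names and classes here are assumptions, not from the original post; only the <async-supported> elements are the point):

<servlet>
    <servlet-name>dispatcher</servlet-name>
    <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
    <load-on-startup>1</load-on-startup>
    <async-supported>true</async-supported>
</servlet>

<filter>
    <filter-name>springSecurityFilterChain</filter-name>
    <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
    <async-supported>true</async-supported>
</filter>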

asyncTimeout

If you are using Tomcat as the servlet and HTTP container for your Spring MVC application, the asyncTimeout attribute needs to be added to the <Connector> element of conf/server.xml in the Tomcat directory, so that async requests are not terminated before the service completes, e.g.:

<Connector asyncTimeout="60000" ... >

If you are using JavaScript to interact with the asynchronous RESTful services, you may also need to specify the timeout property (e.g., in AngularJS's $http configuration) so that the HTTP call does not terminate before the asynchronous service call completes.