
UDF overloading in spark


UDFs (User Defined Functions) are registered with the Hive context so that custom functions can be used in Spark SQL queries. For example, if you want to prepend a string to another string or column, you can create the following UDF:

def addSymbol(input: String, symbol: String): String = symbol + input

Now, to use the above function in hiveContext, we need to register it as a UDF as follows.
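A minimal registration sketch, assuming Spark 1.x with Hive support and an existing SparkContext named sc (the variable names and Spark version are assumptions):

```scala
// Assumes Spark 1.x; sc is an existing SparkContext (an assumption).
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// The UDF itself: prepend `symbol` to `input`.
def addSymbol(input: String, symbol: String): String = symbol + input

// Register the Scala function as a SQL UDF named "addSymbol".
hiveContext.udf.register("addSymbol", addSymbol _)
```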


Now you can use the above UDF in your Spark SQL query as shown below:

hiveContext.sql("select addSymbol('50000','$')").show

Now suppose you want to overload the above UDF with another signature: if the user calls addSymbol with a single argument, we prepend a default string. The first idea that comes to mind is to create another addSymbol function with a single argument and register it with hiveContext as above. Try it, then come back; you will get your answer.

Does it work?

The answer is no; you will see that there is…

View original post 153 more words

SBT-dependency tree


In this blog, I am going to describe how to view an sbt dependency tree. Last week I had a problem related to a cross version of a dependency. I knew the cause of the problem, but I spent a day finding out which dependency had brought in that cross version. After some study and browsing about the problem, I came across an sbt plugin as a potential solution.
In a project, there is a chance that multiple dependencies pull in the same library in different versions. I was also a victim of a dependency version conflict. A good way to diagnose this is to draw the sbt dependency tree, and there is an sbt plugin, sbt-dependency-graph, available for that.

Following are the steps to install and use sbt-dependency-graph:
a) add plugin to project/plugins.sbt
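The plugin line might look like this in project/plugins.sbt (the version number is an assumption; check the plugin's releases for the latest one):

```scala
// project/plugins.sbt
// sbt-dependency-graph plugin; the version shown is an assumption.
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
```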

b) add sbt setting in build.sbt
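For older versions of the plugin, the setting in build.sbt might look like the line below (newer versions auto-enable themselves and do not need it; treat this as a sketch):

```scala
// build.sbt
// Pull in the plugin's settings (needed only for older plugin versions).
net.virtualvoid.sbt.graph.Plugin.graphSettings
```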

if project is multi module then add to Parent.scala:
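In a multi-module build defined in Scala (e.g. project/Parent.scala; the file, object, and module names are assumptions), the settings can be added to each module, sketched as:

```scala
// project/Parent.scala (file, object, and module names are assumptions)
import sbt._
import Keys._

object Parent extends Build {
  // Add the plugin's settings to each module that should report its tree.
  lazy val core = Project("core", file("core"))
    .settings(net.virtualvoid.sbt.graph.Plugin.graphSettings: _*)
}
```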

Now run sbt…
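Running the tree task might look like this, depending on the plugin version (the task names are assumptions based on the plugin's documentation):

```shell
# Print the dependency tree for the current project
sbt dependency-tree    # older plugin versions
sbt dependencyTree     # newer plugin versions
```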

View original post 14 more words