apache spark - How to cast Any to Double in Scala
I am iterating over a DataFrame and need to apply custom code to each row. I'm doing:

convertedPaths.foreach { row: Row =>
  cfMap = row.getValuesMap(channelSet.toSeq)
}
Assume channelSet is a set of column names. I've declared cfMap of type Map[String, Any], since getValuesMap (as I understood it) returns the data in whatever type each column has.
Also, the columns are of Long type, and I am trying:

channelSet.foreach { key: String =>
  var frequency = cfMap.get(key).get.asInstanceOf[Double]
  var value = c * frequency
}
Given that c is a variable of type Double and value needs to be the product of c and frequency, this gives me the following error:

overloaded method value * with alternatives:
  (x: Double)Double <and>
  (x: Float)Double <and>
  (x: Long)Double <and>
  (x: Int)Double <and>
  (x: Char)Double <and>
  (x: Short)Double <and>
  (x: Byte)Double
cannot be applied to (Any)
Why is asInstanceOf[Double] not the correct solution here, and what is the right way to do this?
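Even setting the compile error aside, asInstanceOf[Double] would still fail here at runtime: a Long column value is boxed as java.lang.Long on the JVM, and a cast to Double cannot unbox it. A minimal sketch (plain Scala, no Spark; the value 42L stands in for what a Long column would hold):

```scala
object CastDemo extends App {
  // An Any holding a Long, as a Map[String, Any] built from row values would
  val any: Any = 42L

  // asInstanceOf[Double] throws ClassCastException:
  // java.lang.Long cannot be cast to java.lang.Double
  val threw =
    try { any.asInstanceOf[Double]; false }
    catch { case _: ClassCastException => true }
  println(s"asInstanceOf[Double] threw: $threw")  // true

  // Cast to the actual runtime type first, then convert numerically
  val frequency: Double = any.asInstanceOf[Long].toDouble
  println(frequency)  // 42.0
}
```

The distinction is that asInstanceOf is a cast (it reinterprets the box, which must already be the target type), while toDouble is a numeric conversion.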
Although it doesn't answer the problem statement: I'm also trying to read up on accumulators, to see how to collect the result so that the major portion of the logic can run inside the convertedPaths.foreach loop.
I was able to resolve the problem by declaring

var cfMap = Map[String, Long]()

instead of

var cfMap = Map[String, Any]()

since I know the columns in convertedPaths are of Long type.
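With the map typed as Map[String, Long], the lookup returns a Long and a plain toDouble conversion replaces the cast. A minimal sketch (plain Scala; the map literal and the names ch1/ch2 are hypothetical stand-ins for what row.getValuesMap would produce — note that Row.getValuesMap is generic, so the element type can also be fixed at the call site, e.g. row.getValuesMap[Long](channelSet.toSeq)):

```scala
object TypedMapDemo extends App {
  // Stand-in for the per-row values; in the real code this comes from
  // row.getValuesMap[Long](channelSet.toSeq)
  val cfMap = Map[String, Long]("ch1" -> 3L, "ch2" -> 5L)
  val channelSet = Set("ch1", "ch2")
  val c: Double = 0.5

  channelSet.foreach { key =>
    // Long -> Double is a numeric conversion, so no asInstanceOf is needed
    val frequency: Double = cfMap(key).toDouble
    val value = c * frequency
    println(s"$key -> $value")
  }
}
```

If the column types were mixed or unknown, the alternative would be a pattern match on the Any value (case l: Long => l.toDouble; case d: Double => d) rather than a single cast.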