Add dayofyear, weekofyear, month, dayofmonth, minute, second, next_da… #268

Avasil wants to merge 6 commits into typelevel:master
Conversation
```scala
 * apache/spark
 */
def next_day[T](date: AbstractTypedColumn[T, String], dayOfWeek: String): date.ThisType[T, Option[java.sql.Date]] =
```
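For context on the semantics being wrapped here: vanilla Spark's `next_day` returns the first date strictly after the input that falls on the given day of the week. A plain-Scala sketch of that behaviour using `java.time` (illustrative only, not Spark's implementation; `nextDay` is a hypothetical helper name):

```scala
import java.time.{DayOfWeek, LocalDate}
import java.time.temporal.TemporalAdjusters

// First date strictly after `date` that falls on `dayOfWeek`.
// TemporalAdjusters.next is exclusive of the input date, matching
// next_day's behaviour when the input already falls on the target day.
def nextDay(date: LocalDate, dayOfWeek: DayOfWeek): LocalDate =
  date.`with`(TemporalAdjusters.next(dayOfWeek))

// 2015-01-14 was a Wednesday, so the next Tuesday is 2015-01-20,
// matching Spark's documented example for next_day('2015-01-14', 'TU').
nextDay(LocalDate.of(2015, 1, 14), DayOfWeek.TUESDAY) // 2015-01-20
```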
Right now it doesn't compile.
In Spark it returns java.sql.Date; I'm not sure whether I should add a TypedEncoder for that or use something else.
I don't see any issue with an encoder for java.sql.Date; if this is what's returned in vanilla Spark, we can simply follow.
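One way to see why such an encoder is plausible: a date at midnight UTC round-trips losslessly through its days-since-epoch count. A minimal standalone sketch of that invertible mapping (the `Inj` trait and `dateAsInt` below are illustrative only, not the actual frameless `Injection` or Spark's `DateTimeUtils`; assumes dates at or after the epoch, at midnight UTC):

```scala
import java.sql.Date
import java.util.concurrent.TimeUnit

// Minimal stand-in for an invertible A <-> B mapping.
trait Inj[A, B] {
  def apply(a: A): B
  def invert(b: B): A
}

// Encode a date as days since the Unix epoch, and back.
// Only exact for dates whose millis lie on a UTC-midnight boundary
// at or after the epoch (toDays truncates toward zero).
val dateAsInt: Inj[Date, Int] = new Inj[Date, Int] {
  def apply(d: Date): Int = TimeUnit.MILLISECONDS.toDays(d.getTime).toInt
  def invert(n: Int): Date = new Date(TimeUnit.DAYS.toMillis(n.toLong))
}

// Round trip: invert then apply recovers the same day count.
val d = dateAsInt.invert(17897)
dateAsInt(d) // 17897
```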
@Avasil Sorry for taking so long to have a look at your PR... The diff looks pretty good at a quick glance. Do you need help with anything to get the CI green before the review?
Thanks, and no problem @OlivierBlanvillain, I was a bit busy lately too. I think I'm good; if I have any issues with
@OlivierBlanvillain Hmm, now it fails the test related to #205:

```scala
test("#205: comparing literals encoded using Injection") {
  import org.apache.spark.sql.catalyst.util.DateTimeUtils

  implicit val dateAsInt: Injection[java.sql.Date, Int] =
    Injection(DateTimeUtils.fromJavaDate, DateTimeUtils.toJavaDate)

  val today = new java.sql.Date(System.currentTimeMillis)
  val data = Vector(P(42, today))
  val tds = TypedDataset.create(data)

  tds.filter(tds('d) === today).collect().run()
}

final case class P(i: Int, d: java.sql.Date)
```

It's failing with:

Any tips on how to debug stuff like that? I need to somehow figure out why Spark tries to generate code this way. :D
```
# Conflicts:
#	dataset/src/main/scala/frameless/functions/NonAggregateFunctions.scala
#	dataset/src/test/scala/frameless/functions/NonAggregateFunctionsTests.scala
```
@OlivierBlanvillain @imarios Any ideas how to proceed? :) IIRC many other column functions could use
…y Column functions
Related to #164