Commit 49f0357

Deploying to gh-pages from @ f701082 🚀

alfonsorr committed Feb 27, 2024
1 parent fa51705
Showing 4 changed files with 11 additions and 11 deletions.
4 changes: 2 additions & 2 deletions docs/exclusive.md
@@ -103,7 +103,7 @@ val sparkCol = f.expr("array_sort(value, (l, r) -> case " +

val doricCol = colArray[Row]("value").sortBy(CName("name"), CNameOrd("age", Desc))
// doricCol: ArrayColumn[Row] = TransformationDoricColumn(
- // Kleisli(scala.Function1$$Lambda$3002/0x0000000801343840@2be41d90)
+ // Kleisli(scala.Function1$$Lambda$3003/0x0000000801343840@3ea4712b)
// )

dfArrayStruct.select(sparkCol.as("sorted")).show(false)
@@ -151,7 +151,7 @@ val mapColDoric = colString("value").matches[String]
.caseW(_.length > 4, "error key".lit)
.otherwiseNull
// mapColDoric: DoricColumn[String] = TransformationDoricColumn(
- // Kleisli(scala.Function1$$Lambda$3002/0x0000000801343840@3a7c75d9)
+ // Kleisli(scala.Function1$$Lambda$3003/0x0000000801343840@569b57c4)
// )

dfMatch.withColumn("mapResult", mapColDoric).show()
8 changes: 4 additions & 4 deletions docs/implicits.md
@@ -85,7 +85,7 @@ val complexCol: DoricColumn[Int] =
.transform(_ + 1.lit)
.aggregate(0.lit)(_ + _)
// complexCol: DoricColumn[Int] = TransformationDoricColumn(
- // Kleisli(scala.Function1$$Lambda$3002/0x0000000801343840@5234f6c5)
+ // Kleisli(scala.Function1$$Lambda$3003/0x0000000801343840@366558a2)
// )

dfArrays.select(complexCol as "complexTransformation").show()
@@ -277,7 +277,7 @@ The default doric syntax is a little stricter and forces us to transform these v
```scala
val colD = colInt("int") + 1.lit
// colD: DoricColumn[Int] = TransformationDoricColumn(
- // Kleisli(scala.Function1$$Lambda$3002/0x0000000801343840@51787bc1)
+ // Kleisli(scala.Function1$$Lambda$3003/0x0000000801343840@7bc0e96a)
// )

intDF.select(colD).show()
@@ -298,11 +298,11 @@ we have to _explicitly_ add the following import statement:
import doric.implicitConversions.literalConversion
val colSugarD = colInt("int") + 1
// colSugarD: DoricColumn[Int] = TransformationDoricColumn(
- // Kleisli(scala.Function1$$Lambda$3002/0x0000000801343840@734bb821)
+ // Kleisli(scala.Function1$$Lambda$3003/0x0000000801343840@3368136a)
// )
val columConcatLiterals = concat("this", "is","doric") // concat expects DoricColumn[String] values, the conversion puts them as expected
// columConcatLiterals: StringColumn = TransformationDoricColumn(
- // Kleisli(scala.Function1$$Lambda$3002/0x0000000801343840@7bc0e96a)
+ // Kleisli(scala.Function1$$Lambda$3003/0x0000000801343840@21dcd0b5)
// )

intDF.select(colSugarD, columConcatLiterals).show()
8 changes: 4 additions & 4 deletions docs/quickstart.md
Original file line number Diff line number Diff line change
@@ -28,7 +28,7 @@ _Maven_
Doric is committed to use the most modern APIs first.
<!-- * Doric is compatible with Spark version 3.4.0. -->
* The latest stable version of doric is 0.0.7.
- * The latest experimental version of doric is 0.0.0+1-c8febe50-SNAPSHOT.
+ * The latest experimental version of doric is 0.0.0+1-f7010827-SNAPSHOT.
* Doric is compatible with the following Spark versions:

| Spark | Scala | Tested | doric |
@@ -85,7 +85,7 @@ It's only when we try to construct the DataFrame that an exception is raised at
```scala
df
// org.apache.spark.sql.AnalysisException: [DATATYPE_MISMATCH.BINARY_OP_DIFF_TYPES] Cannot resolve "(value * true)" due to data type mismatch: the left and right operands of the binary operator have incompatible types ("INT" and "BOOLEAN").;
- // 'Project [unresolvedalias((value#365 * true), Some(org.apache.spark.sql.Column$$Lambda$5128/0x0000000801bca840@73d193ab))]
+ // 'Project [unresolvedalias((value#365 * true), Some(org.apache.spark.sql.Column$$Lambda$5129/0x0000000801bca840@2cb6e497))]
// +- LocalRelation [value#365]
//
// at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.dataTypeMismatch(package.scala:73)
@@ -182,7 +182,7 @@ strDf.select(f.col("str").asDoric[String]).show()
strDf.select((f.col("str") + f.lit(true)).asDoric[String]).show()
// doric.sem.DoricMultiError: Found 1 error in select
// [DATATYPE_MISMATCH.BINARY_OP_DIFF_TYPES] Cannot resolve "(str + true)" due to data type mismatch: the left and right operands of the binary operator have incompatible types ("DOUBLE" and "BOOLEAN").;
- // 'Project [unresolvedalias((cast(str#378 as double) + true), Some(org.apache.spark.sql.Column$$Lambda$5128/0x0000000801bca840@73d193ab))]
+ // 'Project [unresolvedalias((cast(str#378 as double) + true), Some(org.apache.spark.sql.Column$$Lambda$5129/0x0000000801bca840@2cb6e497))]
// +- Project [value#375 AS str#378]
// +- LocalRelation [value#375]
//
@@ -196,7 +196,7 @@ strDf.select((f.col("str") + f.lit(true)).asDoric[String]).show()
// at repl.MdocSession$MdocApp$$anonfun$2.apply(quickstart.md:76)
// at repl.MdocSession$MdocApp$$anonfun$2.apply(quickstart.md:76)
// Caused by: org.apache.spark.sql.AnalysisException: [DATATYPE_MISMATCH.BINARY_OP_DIFF_TYPES] Cannot resolve "(str + true)" due to data type mismatch: the left and right operands of the binary operator have incompatible types ("DOUBLE" and "BOOLEAN").;
- // 'Project [unresolvedalias((cast(str#378 as double) + true), Some(org.apache.spark.sql.Column$$Lambda$5128/0x0000000801bca840@73d193ab))]
+ // 'Project [unresolvedalias((cast(str#378 as double) + true), Some(org.apache.spark.sql.Column$$Lambda$5129/0x0000000801bca840@2cb6e497))]
// +- Project [value#375 AS str#378]
// +- LocalRelation [value#375]
//
2 changes: 1 addition & 1 deletion docs/validations.md
@@ -15,7 +15,7 @@ raising a run-time exception:
// Spark
List(1,2,3).toDF().select(f.col("id")+1)
// org.apache.spark.sql.AnalysisException: [UNRESOLVED_COLUMN.WITH_SUGGESTION] A column or function parameter with name `id` cannot be resolved. Did you mean one of the following? [`value`].;
- // 'Project [unresolvedalias(('id + 1), Some(org.apache.spark.sql.Column$$Lambda$5128/0x0000000801bca840@73d193ab))]
+ // 'Project [unresolvedalias(('id + 1), Some(org.apache.spark.sql.Column$$Lambda$5129/0x0000000801bca840@2cb6e497))]
// +- LocalRelation [value#399]
//
// at org.apache.spark.sql.errors.QueryCompilationErrors$.unresolvedAttributeError(QueryCompilationErrors.scala:221)
