Commit

Update README for 0.11.0 (#507)
srowen authored Dec 7, 2020
1 parent 2b4aca0 commit 74b9802
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions README.md
@@ -26,15 +26,15 @@ You can link against this library in your program at the following coordinates:
```
groupId: com.databricks
artifactId: spark-xml_2.11
-version: 0.10.0
+version: 0.11.0
```

### Scala 2.12

```
groupId: com.databricks
artifactId: spark-xml_2.12
-version: 0.10.0
+version: 0.11.0
```

## Using with Spark shell
@@ -43,12 +43,12 @@ This package can be added to Spark using the `--packages` command line option. For example:

### Spark compiled with Scala 2.11
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.10.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.11.0
```

### Spark compiled with Scala 2.12
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.10.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.11.0
```
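Once the shell starts with the package on the classpath, the data source is available under the `xml` format name; a hedged sketch of reading the `books.xml` example used elsewhere in this README (this assumes a running `spark-shell` session, so it is not standalone-runnable):

```scala
// Paste inside spark-shell; `spark` is the SparkSession the shell provides.
// "rowTag" names the XML element that becomes one DataFrame row.
val df = spark.read
  .format("xml")
  .option("rowTag", "book")
  .load("books.xml")

df.printSchema()  // schema inferred from the XML structure
df.show(5)
```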

## Features
@@ -409,7 +409,7 @@ Automatically infer schema (data types)
```R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.11.0"))

df <- read.df("books.xml", source = "xml", rowTag = "book")

@@ -421,7 +421,7 @@ You can manually specify schema:
```R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.11.0"))
customSchema <- structType(
structField("_id", "string"),
structField("author", "string"),
