@@ -257,49 +257,7 @@ val a: Tuple2<Int, Long> = tupleOf(1, 2L)
 val b: Tuple3<String, Double, Int> = t("test", 1.0, 2)
 val c: Tuple3<Float, String, Int> = 5f X "aaa" X 1
 ```
-Tuples can be expanded and merged like this:
-```kotlin
-// expand
-tupleOf(1, 2).appendedBy(3) == tupleOf(1, 2, 3)
-tupleOf(1, 2) + 3 == tupleOf(1, 2, 3)
-tupleOf(2, 3).prependedBy(1) == tupleOf(1, 2, 3)
-1 + tupleOf(2, 3) == tupleOf(1, 2, 3)
-
-// merge
-tupleOf(1, 2) concat tupleOf(3, 4) == tupleOf(1, 2, 3, 4)
-tupleOf(1, 2) + tupleOf(3, 4) == tupleOf(1, 2, 3, 4)
-
-// extend tuple instead of merging with it
-tupleOf(1, 2).appendedBy(tupleOf(3, 4)) == tupleOf(1, 2, tupleOf(3, 4))
-tupleOf(1, 2) + tupleOf(tupleOf(3, 4)) == tupleOf(1, 2, tupleOf(3, 4))
-```
-
-The concept of `EmptyTuple` from Scala 3 is also already present:
-```kotlin
-tupleOf(1).dropLast() == tupleOf() == emptyTuple()
-```
-
-Finally, all these tuple helper functions are also baked in:
-
-- `componentX()` for destructuring: `val (a, b) = tuple`
-- `dropLast() / dropFirst()`
-- `contains(x)` for `if (x in tuple) { ... }`
-- `iterator()` for `for (x in tuple) { ... }`
-- `asIterable()`
-- `size`
-- `get(n) / get(i..j)` for `tuple[1] / tuple[i..j]`
-- `getOrNull(n) / getOrNull(i..j)`
-- `getAs<T>(n) / getAs<T>(i..j)`
-- `getAsOrNull<T>(n) / getAsOrNull<T>(i..j)`
-- `copy(_1 = ..., _5 = ...)`
-- `first() / last()`
-- `_1`, `_6` etc. (instead of `_1()`, `_6()`)
-- `zip`
-- `dropN() / dropLastN()`
-- `takeN() / takeLastN()`
-- `splitAtN()`
-- `map`
-- `cast`
+To read more about tuples and all the added functions, refer to the [wiki](https://github.com/JetBrains/kotlin-spark-api/wiki/Tuples).
 
 ### Streaming
 
@@ -342,6 +300,7 @@ withSparkStreaming(batchDuration = Durations.seconds(1), timeout = 10_000) { //
 }
 ```
 
+For more information, check the [wiki](https://github.com/JetBrains/kotlin-spark-api/wiki/Streaming).
 
 ## Examples
 
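As context for the change above: the tuple behaviour described in the removed list (destructuring via `componentX()`, indexing via `get(n)`, arity-growing `appendedBy`) can be sketched with a small, self-contained stand-in. The `Tuple2`/`Tuple3` classes below are hypothetical illustrations only, not the actual kotlin-spark-api types.

```kotlin
// Hypothetical stand-ins for illustration only -- not the kotlin-spark-api types.
data class Tuple3<A, B, C>(val _1: A, val _2: B, val _3: C)

data class Tuple2<A, B>(val _1: A, val _2: B) {
    val size: Int get() = 2

    // `get(n)` backs the `tuple[n]` indexing syntax
    operator fun get(n: Int): Any? = when (n) {
        0 -> _1
        1 -> _2
        else -> throw IndexOutOfBoundsException("$n")
    }

    // `appendedBy` grows the tuple by one element, changing its arity
    fun <C> appendedBy(c: C): Tuple3<A, B, C> = Tuple3(_1, _2, c)
}

fun main() {
    val t = Tuple2(1, 2L)
    val (a, b) = t            // componentN() destructuring comes free with data classes
    println(a + b)            // prints 3
    println(t[0])             // prints 1
    println(t.appendedBy(3))  // prints Tuple3(_1=1, _2=2, _3=3)
}
```

Because each arity is its own class, `appendedBy` can return a result whose type records the new arity, which is what lets the real library keep tuple operations statically typed.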