Spark under Windows
Hmm. Under Linux the result is a bit different. (I changed the file paths and the splitter.)
import org.apache.spark.{SparkConf, SparkContext}

// Single-threaded local Spark context
val conf = new SparkConf().setMaster("local").setAppName("My App")
val sc = new SparkContext(conf)
printf("[::1]")
val input = sc.textFile("/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity.csv")
printf("[::2]")
// Split every CSV line into fields
val words = input.flatMap(line => line.split(","))
printf("[::3]")
// Word count: pair each field with 1, then sum per key
val counts = words.map(word => (word, 1)).reduceByKey(_ + _)
printf("[::4]")
counts.saveAsTextFile("/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity-report")
printf("OK") // no trailing newline: this is why "OK" glues to a log line below
name := "probeSpark"
version := "0.1"
scalaVersion := "2.12.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"
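A side note, and purely an assumption on my part: the shutdown noise further down in the log (the InterruptedException traces) is typical when Spark runs inside sbt's own JVM. Forking a separate JVM for `run` usually keeps sbt's teardown from interrupting Spark's background threads:

// Hypothetical addition to build.sbt: run the app in a forked JVM
// so sbt's classloader/thread cleanup doesn't touch Spark's threads.
fork in run := true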
Instead of a single report I got a whole batch of files. Hmm... is that correct? I don't know. It feels as if a final merge step never ran to completion (but see the note just below).
There were errors in the log, but they did not interrupt the computation.
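Apparently this is normal: saveAsTextFile writes one part-NNNNN file per RDD partition, and there is no merge step at all. If a single file is really needed, the usual trick is to collapse to one partition before saving. A minimal sketch (the "-single" suffix is just an illustrative name):

// Squash the result into a single partition so exactly one
// part-00000 file is written. Only sensible while the output
// fits on one node; the final stage then runs as a single task.
counts.coalesce(1)
  .saveAsTextFile("/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity-report-single")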
[info] Loading project definition from /home/mayton/git/probeSpark/project
[info] Loading settings for project probespark from build.sbt ...
[info] Set current project to probeSpark (in build file:/home/mayton/git/probeSpark/)
[info] Running Main
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/23 09:56:54 WARN Utils: Your hostname, mayton-ryzen resolves to a loopback address: 127.0.1.1; using 192.168.0.110 instead (on interface enp7s0)
19/05/23 09:56:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/tmp/sbt_e76d0bca/target/f8e12f96/spark-unsafe_2.12-2.4.3.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
19/05/23 09:56:54 INFO SparkContext: Running Spark version 2.4.3
19/05/23 09:56:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/23 09:56:54 INFO SparkContext: Submitted application: My App
19/05/23 09:56:54 INFO SecurityManager: Changing view acls to: mayton
19/05/23 09:56:54 INFO SecurityManager: Changing modify acls to: mayton
19/05/23 09:56:54 INFO SecurityManager: Changing view acls groups to:
19/05/23 09:56:54 INFO SecurityManager: Changing modify acls groups to:
19/05/23 09:56:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mayton); groups with view permissions: Set(); users with modify permissions: Set(mayton); groups with modify permissions: Set()
19/05/23 09:56:55 INFO Utils: Successfully started service 'sparkDriver' on port 46127.
19/05/23 09:56:55 INFO SparkEnv: Registering MapOutputTracker
19/05/23 09:56:55 INFO SparkEnv: Registering BlockManagerMaster
19/05/23 09:56:55 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/23 09:56:55 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/23 09:56:55 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-1779fe36-c630-405d-ae8e-52ce5a31db79
19/05/23 09:56:55 INFO MemoryStore: MemoryStore started with capacity 408.9 MB
19/05/23 09:56:55 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/23 09:56:55 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/05/23 09:56:55 INFO Utils: Successfully started service 'SparkUI' on port 4041.
19/05/23 09:56:55 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.110:4041
19/05/23 09:56:55 INFO Executor: Starting executor ID driver on host localhost
19/05/23 09:56:55 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41997.
19/05/23 09:56:55 INFO NettyBlockTransferService: Server created on 192.168.0.110:41997
19/05/23 09:56:55 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/23 09:56:55 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.0.110, 41997, None)
19/05/23 09:56:55 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.0.110:41997 with 408.9 MB RAM, BlockManagerId(driver, 192.168.0.110, 41997, None)
19/05/23 09:56:55 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.0.110, 41997, None)
19/05/23 09:56:55 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.0.110, 41997, None)
19/05/23 09:56:56 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 107.2 KB, free 408.8 MB)
19/05/23 09:56:56 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 20.4 KB, free 408.8 MB)
19/05/23 09:56:56 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.0.110:41997 (size: 20.4 KB, free: 408.9 MB)
19/05/23 09:56:56 INFO SparkContext: Created broadcast 0 from textFile at Main.scala:11
19/05/23 09:56:56 INFO FileInputFormat: Total input paths to process : 1
19/05/23 09:56:56 INFO deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
19/05/23 09:56:56 INFO HadoopMapRedCommitProtocol: Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19/05/23 09:56:56 INFO SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
19/05/23 09:56:56 INFO DAGScheduler: Registering RDD 3 (map at Main.scala:15)
19/05/23 09:56:56 INFO DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 13 output partitions
19/05/23 09:56:56 INFO DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
19/05/23 09:56:56 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
19/05/23 09:56:56 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
19/05/23 09:56:56 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at map at Main.scala:15), which has no missing parents
19/05/23 09:56:56 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.7 KB, free 408.8 MB)
19/05/23 09:56:56 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.3 KB, free 408.8 MB)
19/05/23 09:56:56 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.0.110:41997 (size: 3.3 KB, free: 408.9 MB)
19/05/23 09:56:56 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1161
19/05/23 09:56:56 INFO DAGScheduler: Submitting 13 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at map at Main.scala:15) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12))
19/05/23 09:56:56 INFO TaskSchedulerImpl: Adding task set 0.0 with 13 tasks
19/05/23 09:56:56 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7419 bytes)
19/05/23 09:56:56 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
19/05/23 09:56:56 INFO HadoopRDD: Input split: file:/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity.csv:0+33554432
19/05/23 09:56:59 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1172 bytes result sent to driver
19/05/23 09:56:59 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7419 bytes)
19/05/23 09:56:59 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
..............
rmat/GeoIP-139_20101001/GeoIPCity-report/_temporary/0/task_20190523100430_0005_m_000010
19/05/23 10:05:13 INFO SparkHadoopMapRedUtil: attempt_20190523100430_0005_m_000010_0: Committed
19/05/23 10:05:13 INFO Executor: Finished task 10.0 in stage 1.0 (TID 23). 1465 bytes result sent to driver
19/05/23 10:05:13 INFO TaskSetManager: Starting task 11.0 in stage 1.0 (TID 24, localhost, executor driver, partition 11, ANY, 7141 bytes)
19/05/23 10:05:13 INFO TaskSetManager: Finished task 10.0 in stage 1.0 (TID 23) in 1030 ms on localhost (executor driver) (11/13)
19/05/23 10:05:13 INFO Executor: Running task 11.0 in stage 1.0 (TID 24)
19/05/23 10:05:13 INFO ShuffleBlockFetcherIterator: Getting 13 non-empty blocks including 13 local blocks and 0 remote blocks
19/05/23 10:05:13 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/05/23 10:05:14 INFO HadoopMapRedCommitProtocol: Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19/05/23 10:05:14 INFO FileOutputCommitter: Saved output of task 'attempt_20190523100430_0005_m_000011_0' to file:/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity-report/_temporary/0/task_20190523100430_0005_m_000011
19/05/23 10:05:14 INFO SparkHadoopMapRedUtil: attempt_20190523100430_0005_m_000011_0: Committed
19/05/23 10:05:14 INFO Executor: Finished task 11.0 in stage 1.0 (TID 24). 1465 bytes result sent to driver
19/05/23 10:05:14 INFO TaskSetManager: Starting task 12.0 in stage 1.0 (TID 25, localhost, executor driver, partition 12, ANY, 7141 bytes)
19/05/23 10:05:14 INFO Executor: Running task 12.0 in stage 1.0 (TID 25)
19/05/23 10:05:14 INFO TaskSetManager: Finished task 11.0 in stage 1.0 (TID 24) in 1171 ms on localhost (executor driver) (12/13)
19/05/23 10:05:14 INFO ShuffleBlockFetcherIterator: Getting 13 non-empty blocks including 13 local blocks and 0 remote blocks
19/05/23 10:05:14 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/05/23 10:05:15 INFO HadoopMapRedCommitProtocol: Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
19/05/23 10:05:15 INFO FileOutputCommitter: Saved output of task 'attempt_20190523100430_0005_m_000012_0' to file:/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity-report/_temporary/0/task_20190523100430_0005_m_000012
19/05/23 10:05:15 INFO SparkHadoopMapRedUtil: attempt_20190523100430_0005_m_000012_0: Committed
19/05/23 10:05:15 INFO Executor: Finished task 12.0 in stage 1.0 (TID 25). 1465 bytes result sent to driver
19/05/23 10:05:15 INFO TaskSetManager: Finished task 12.0 in stage 1.0 (TID 25) in 1036 ms on localhost (executor driver) (13/13)
19/05/23 10:05:15 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
19/05/23 10:05:15 INFO DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 14.524 s
19/05/23 10:05:15 INFO DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 45.735034 s
19/05/23 10:05:15 INFO SparkHadoopWriter: Job job_20190523100430_0005 committed.
OK19/05/23 10:05:15 WARN FileSystem: exception in the cleaner thread but it will continue to run
java.lang.InterruptedException
at java.base/java.lang.Object.wait(Native Method)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:176)
at org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:2989)
at java.base/java.lang.Thread.run(Thread.java:834)
19/05/23 10:05:15 ERROR Utils: uncaught error in thread spark-listener-group-executorManagement, stopping SparkContext
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:97)
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:83)
19/05/23 10:05:15 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.base/java.lang.Object.wait(Native Method)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
at org.apache.spark.ContextCleaner.$anonfun$keepCleaning$1(ContextCleaner.scala:181)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:179)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)
19/05/23 10:05:15 ERROR Utils: uncaught error in thread spark-listener-group-appStatus, stopping SparkContext
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:97)
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:83)
19/05/23 10:05:15 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-executorManagement
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:97)
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:83)
19/05/23 10:05:15 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-appStatus
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:97)
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:83)
19/05/23 10:05:15 INFO SparkContext: SparkContext already stopped.
19/05/23 10:05:15 INFO SparkUI: Stopped Spark web UI at http://192.168.0.110:4041
[success] Total time: 52 s, completed May 23, 2019, 10:05:15 AM
19/05/23 10:05:15 INFO DiskBlockManager: Shutdown hook called
19/05/23 10:05:15 INFO ShutdownHookManager: Shutdown hook called
19/05/23 10:05:15 INFO ShutdownHookManager: Deleting directory /tmp/spark-d0f8141d-9d13-45c9-b36c-c61b03755edd/userFiles-262e5486-7b4e-4839-a8a1-3c56eda9ee19
19/05/23 10:05:15 INFO ShutdownHookManager: Deleting directory /tmp/spark-d0f8141d-9d13-45c9-b36c-c61b03755edd
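Note that the job itself finished cleanly: "Job job_20190523100430_0005 committed" and sbt reports [success]. My guess is that the InterruptedException traces appear because the program never calls sc.stop(), so once main() returns, sbt interrupts Spark's listener and cleaner threads while they are still waiting. An explicit stop at the end should make the exit quiet:

// Last line of the program: shut the context down in order
// (listeners, cleaner, UI) instead of letting sbt interrupt them.
sc.stop()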
The reports:
$ ls -la
total 195112
drwxr-xr-x 1 mayton mayton 692 May 23 10:05 .
drwxrwxrwx 1 mayton mayton 106 May 23 10:04 ..
-rw-r--r-- 1 mayton mayton 15250705 May 23 10:05 part-00000
-rw-r--r-- 1 mayton mayton 119156 May 23 10:05 .part-00000.crc
-rw-r--r-- 1 mayton mayton 15254781 May 23 10:05 part-00001
-rw-r--r-- 1 mayton mayton 119188 May 23 10:05 .part-00001.crc
-rw-r--r-- 1 mayton mayton 15245033 May 23 10:05 part-00002
-rw-r--r-- 1 mayton mayton 119112 May 23 10:05 .part-00002.crc
-rw-r--r-- 1 mayton mayton 15253456 May 23 10:05 part-00003
-rw-r--r-- 1 mayton mayton 119176 May 23 10:05 .part-00003.crc
-rw-r--r-- 1 mayton mayton 15251017 May 23 10:05 part-00004
-rw-r--r-- 1 mayton mayton 119160 May 23 10:05 .part-00004.crc
-rw-r--r-- 1 mayton mayton 15239248 May 23 10:05 part-00005
-rw-r--r-- 1 mayton mayton 119068 May 23 10:05 .part-00005.crc
-rw-r--r-- 1 mayton mayton 15232831 May 23 10:05 part-00006
-rw-r--r-- 1 mayton mayton 119016 May 23 10:05 .part-00006.crc
-rw-r--r-- 1 mayton mayton 15239873 May 23 10:05 part-00007
-rw-r--r-- 1 mayton mayton 119072 May 23 10:05 .part-00007.crc
-rw-r--r-- 1 mayton mayton 15227945 May 23 10:05 part-00008
-rw-r--r-- 1 mayton mayton 118980 May 23 10:05 .part-00008.crc
-rw-r--r-- 1 mayton mayton 15251852 May 23 10:05 part-00009
-rw-r--r-- 1 mayton mayton 119164 May 23 10:05 .part-00009.crc
-rw-r--r-- 1 mayton mayton 15235567 May 23 10:05 part-00010
-rw-r--r-- 1 mayton mayton 119036 May 23 10:05 .part-00010.crc
-rw-r--r-- 1 mayton mayton 15240895 May 23 10:05 part-00011
-rw-r--r-- 1 mayton mayton 119080 May 23 10:05 .part-00011.crc
-rw-r--r-- 1 mayton mayton 15253430 May 23 10:05 part-00012
-rw-r--r-- 1 mayton mayton 119176 May 23 10:05 .part-00012.crc
-rw-r--r-- 1 mayton mayton 0 May 23 10:05 _SUCCESS
-rw-r--r-- 1 mayton mayton 8 May 23 10:05 ._SUCCESS.crc
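This layout is exactly what a successful saveAsTextFile produces: one part file per partition, a zero-byte _SUCCESS marker, and a hidden .crc checksum per file from Hadoop's local filesystem. Thirteen parts because HadoopRDD cut the input CSV into 32 MB splits (the 0+33554432 input split in the log) and reduceByKey kept that partition count. Nothing has to be merged by hand to read it back; textFile over the directory picks up all the parts and skips _SUCCESS and the .crc files:

// The output directory behaves as one dataset on read:
// textFile globs part-* and ignores _- and .-prefixed files.
val report = sc.textFile("/db/GEO/maxmind/2010-10.MaxMind GeoIP City CSV Format/GeoIP-139_20101001/GeoIPCity-report")
println(report.count())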