When submitting a Spark application with SparkLauncher, the launch sometimes hangs with the status stuck at RUNNING and finalStatus UNDEFINED. SparkLauncher uses Java's ProcessBuilder under the hood, and after some research the likely cause turned out to be a filled-up I/O buffer blocking the subprocess.
The official Java documentation for Process warns about exactly this:
By default, the created subprocess does not have its own terminal or console. All its standard I/O (i.e. stdin, stdout, stderr) operations will be redirected to the parent process, where they can be accessed via the streams obtained using the methods getOutputStream(), getInputStream(), and getErrorStream(). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
So we need to start our own threads to keep reading those buffers. The most important one is getErrorStream(): Spark generally writes its INFO-level log output to stderr, so that stream fills up first if left unread.
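Below is a minimal sketch of that idea: launch the application with SparkLauncher, then spawn a daemon thread per stream to drain stdout and stderr so the subprocess can never block on a full buffer. The app resource path, main class, and master URL are hypothetical placeholders.

```java
import org.apache.spark.launcher.SparkLauncher;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class LaunchWithDrainedStreams {

    // Drain an InputStream on a background thread so the subprocess
    // never blocks waiting for the parent to consume its output.
    private static void drain(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(tag + line);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        t.setDaemon(true);
        t.start();
    }

    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/app.jar")   // hypothetical
                .setMainClass("com.example.MyApp")    // hypothetical
                .setMaster("yarn")
                .launch();

        // Spark's INFO logging goes to stderr, so drain it first;
        // drain stdout as well to be safe.
        drain(spark.getErrorStream(), "[stderr] ");
        drain(spark.getInputStream(), "[stdout] ");

        int exitCode = spark.waitFor();
        System.out.println("Spark process exited with code " + exitCode);
    }
}
```

As an aside, newer SparkLauncher versions also offer startApplication(), which returns a SparkAppHandle and manages the child's output internally, as well as redirectOutput()/redirectError() for sending the streams elsewhere; either can replace the hand-rolled draining threads above if your Spark version supports them.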