First Spark Example: Computing Pi

To submit a jar to Spark, use the spark-submit script under bin. Running it with --help lists the available options:

[hadoop@nbdo1 bin]$ ./spark-submit --help
Usage: spark-submit [options] <app jar | python file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn, or local.
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).
  --class CLASS_NAME          Your application's main class (for Java / Scala apps).
  --name NAME                 A name of your application.
  --jars JARS                 Comma-separated list of local jars to include on the driver
                              and executor classpaths.
  --packages                  Comma-separated list of maven coordinates of jars to include
                              on the driver and executor classpaths. Will search the local
                              maven repo, then maven central and any additional remote
                              repositories given by --repositories. The format for the
                              coordinates should be groupId:artifactId:version.
  --exclude-packages          Comma-separated list of groupId:artifactId, to exclude while
                              resolving the dependencies provided in --packages to avoid
                              dependency conflicts.
  --repositories              Comma-separated list of additional remote repositories to
                              search for the maven coordinates given with --packages.
  --py-files PY_FILES         Comma-separated list of .zip, .egg, or .py files to place
                              on the PYTHONPATH for Python apps.
  --files FILES               Comma-separated list of files to be placed in the working
                              directory of each executor.
  --conf PROP=VALUE           Arbitrary Spark configuration property.
  --properties-file FILE      Path to a file from which to load extra properties. If not
                              specified, this will look for conf/spark-defaults.conf.
  --driver-memory MEM         Memory for driver (e.g. 1000M, 2G) (Default: 1024M).
  --driver-java-options       Extra Java options to pass to the driver.
  --driver-library-path       Extra library path entries to pass to the driver.
  --driver-class-path         Extra class path entries to pass to the driver. Note that
                              jars added with --jars are automatically included in the
                              classpath.
  --executor-memory MEM       Memory per executor (e.g. 1000M, 2G) (Default: 1G).
  --proxy-user NAME           User to impersonate when submitting the application.
                              This argument does not work with --principal / --keytab.
  --help, -h                  Show this help message and exit.
  --verbose, -v               Print additional debug output.
  --version,                  Print the version of current Spark.

 Spark standalone with cluster deploy mode only:
  --driver-cores NUM          Cores for driver (Default: 1).

 Spark standalone or Mesos with cluster deploy mode only:
  --supervise                 If given, restarts the driver on failure.
  --kill SUBMISSION_ID        If given, kills the driver specified.
  --status SUBMISSION_ID      If given, requests the status of the driver specified.

 Spark standalone and Mesos only:
  --total-executor-cores NUM  Total cores for all executors.

 Spark standalone and YARN only:
  --executor-cores NUM        Number of cores per executor. (Default: 1 in YARN mode,
                              or all available cores on the worker in standalone mode)

 YARN-only:
  --driver-cores NUM          Number of cores used by the driver, only in cluster mode
                              (Default: 1).
  --queue QUEUE_NAME          The YARN queue to submit to (Default: "default").
  --num-executors NUM         Number of executors to launch (Default: 2).
                              If dynamic allocation is enabled, the initial number of
                              executors will be at least NUM.
  --archives ARCHIVES         Comma separated list of archives to be extracted into the
                              working directory of each executor.
  --principal PRINCIPAL       Principal to be used to login to KDC, while running on
                              secure HDFS.
  --keytab KEYTAB             The full path to the file that contains the keytab for the
                              principal specified above. This keytab will be copied to
                              the node running the Application Master via the Secure
                              Distributed Cache, for renewing the login tickets and the
                              delegation tokens periodically.


Computing Pi

Spark ships with a set of built-in examples, packaged in /home/hadoop/spark/examples/jars/spark-examples_2.11-2.1.1.jar
(the path and jar name vary between versions).
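
If you run a different Spark version, the _2.11-2.1.1 suffix changes accordingly; one quick way to locate the jar (assuming the same install path as above) is:

[hadoop@nbdo1 ~]$ ls /home/hadoop/spark/examples/jars/ | grep spark-examples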

The SparkPi example estimates Pi with the Monte Carlo method. Background references:

http://blog.csdn.net/xmmxjycqupt/article/details/9320627

https://www.zhihu.com/question/20254139
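
For reference, here is a minimal Scala sketch of the idea, written in the spirit of the bundled SparkPi example rather than quoting its exact source (the object name MonteCarloPi is illustrative): random points are thrown into a square, and the fraction landing inside the inscribed circle approximates pi / 4.

import scala.math.random
import org.apache.spark.sql.SparkSession

object MonteCarloPi {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("Monte Carlo Pi").getOrCreate()
    // The number of slices (partitions) comes from the first argument,
    // e.g. the "100" passed on the command line below.
    val slices = if (args.length > 0) args(0).toInt else 2
    // Total random points to sample, capped to avoid Int overflow.
    val n = math.min(100000L * slices, Int.MaxValue).toInt

    // Sample points uniformly in the square [-1, 1] x [-1, 1] and count how
    // many fall inside the unit circle; that fraction approximates pi / 4.
    val count = spark.sparkContext.parallelize(1 until n, slices).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)

    println(s"Pi is roughly ${4.0 * count / (n - 1)}")
    spark.stop()
  }
}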


Command:

./spark-submit --master spark://nbdo1:7077 --class org.apache.spark.examples.SparkPi /home/hadoop/spark/examples/jars/spark-examples_2.11-2.1.1.jar 100
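
Here --class names the example's main class inside the jar, and the trailing 100 is the application argument: the number of slices the sampling work is split into, so a larger value means more samples. If you want to cap the resources the job takes from the cluster, the same submission can carry the standard options from the help output above, for example (an illustrative variant; adjust the values to your cluster):

./spark-submit --master spark://nbdo1:7077 --class org.apache.spark.examples.SparkPi --executor-memory 1G --total-executor-cores 4 /home/hadoop/spark/examples/jars/spark-examples_2.11-2.1.1.jar 100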




Output:

.........

17/05/15 16:46:31 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/05/15 16:46:31 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 5.875 s
17/05/15 16:46:31 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 6.407038 s
Pi is roughly 3.1420331142033113
17/05/15 16:46:31 INFO SparkUI: Stopped Spark web UI at http://192.168.18.146:4040
17/05/15 16:46:31 INFO StandaloneSchedulerBackend: Shutting down all executors
17/05/15 16:46:31 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down

........



