AUR (en) - apache-spark


Package Details: apache-spark 2.1.0-1

Git Clone URL: https://aur-dev.archlinux.org/apache-spark.git (read-only)
Package Base: apache-spark
Description: fast and general engine for large-scale data processing
Upstream URL: http://spark.apache.org
Licenses: Apache
Submitter: huitseeker
Maintainer: huitseeker
Last Packager: huitseeker
Votes: 12
Popularity: 1.532655
First Submitted: 2015-10-04 09:31
Last Updated: 2017-01-16 22:37

Dependencies (8)

Required by (0)

Sources (7)

Latest Comments


adouzzy commented on 2017-01-16 22:29

Please update to 2.1.0. Cheers

huitseeker commented on 2016-12-29 18:40

@steph.schie updated!

steph.schie commented on 2016-12-27 14:33

Why is hadoop a dependency? I don't need or want to use hadoop with spark.

huitseeker commented on 2016-11-18 07:45

There should be a file /etc/profile.d/apache-spark.sh that sets the $SPARK_HOME environment variable for you. If it is set up correctly, no patching of binaries should be needed. To test that assumption, first check the value of $SPARK_HOME (it should be /opt/apache-spark), then run sh /opt/apache-spark/bin/load-spark-env.sh. If you see an error, please report it with as much information as possible.
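The checks above can be sketched as a short script. This is a minimal sketch, assuming the package's default install prefix (/opt/apache-spark); the check is written as a function so the expected value is explicit.

```shell
# Verify SPARK_HOME matches the location this package installs to.
check_spark_home() {
  # $1: the value of SPARK_HOME to verify
  if [ "$1" = "/opt/apache-spark" ]; then
    echo "SPARK_HOME ok"
  else
    echo "unexpected SPARK_HOME: '$1'"
  fi
}

check_spark_home "$SPARK_HOME"
# If SPARK_HOME is correct, also run the env script and watch for errors:
# sh /opt/apache-spark/bin/load-spark-env.sh
```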

TaXules commented on 2016-11-08 16:30

To fix the error "ls: cannot access '/usr/assembly/target/scala-2.10': No such …", you must patch the spark launch scripts by running:
sed -i 's/`dirname "$0"`/`dirname "$(readlink -f $0)"`/g' /opt/apache-spark/bin/*
(readlink is in coreutils.)
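A small demonstration of why readlink -f matters here (the /tmp/spark-demo paths are made up for illustration): dirname on a symlink returns the symlink's own directory, while resolving the link first returns the real install directory the launch scripts need.

```shell
# Build a fake install: a real script under opt/, symlinked from usr/bin/.
mkdir -p /tmp/spark-demo/opt/apache-spark/bin /tmp/spark-demo/usr/bin
printf '#!/bin/sh\n' > /tmp/spark-demo/opt/apache-spark/bin/spark-shell
ln -sf /tmp/spark-demo/opt/apache-spark/bin/spark-shell \
       /tmp/spark-demo/usr/bin/spark-shell

# The unpatched pattern: the symlink's directory, not the install dir.
dirname /tmp/spark-demo/usr/bin/spark-shell
# The patched pattern: resolve the symlink first, then take dirname.
dirname "$(readlink -f /tmp/spark-demo/usr/bin/spark-shell)"
```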

mtrokic commented on 2016-08-13 17:20

I think some dependencies are missing. After installing gcc-fortran and postgresql-libs I was able to compile successfully.

sidec commented on 2016-07-10 20:16

After a successful makepkg -sri I fail to run spark-shell; I get this message instead:

ls: cannot access '/usr/assembly/target/scala-2.10': No such file or directory Failed to find Spark assembly in /usr/assembly/target/scala-2.10. You need to build Spark before running this program.

axelmagn commented on 2016-06-29 19:24

I can confirm discord's issue. I am having the same problem.

huitseeker commented on 2016-05-03 15:40

I have trouble reproducing your issue, sorry.

discord commented on 2016-04-26 18:31

Considering removing hive from the pom build and rebuilding, since I don't use it. However, I'm not sure why this works for everyone except me.