This article shows how to use pyspark to compute statistics over users' purchase records; hopefully it offers some useful reference for developers tackling similar problems. Interested readers can follow along below.
The goal is to compute four statistics: how many items were sold in total, how many distinct customers made purchases, the sum of all purchase prices (total revenue), and the best-selling product.
The data file, UserPurchaseHistory.csv, has three columns: user name, product name, price.
John,iPhone Cover,9.99
John,Headphones,5.49
Jack,iPhone Cover,9.99
Jill,Samsung Galaxy Cover,8.95
Bob,iPad Cover,5.49
"""A simple Spark app in Python"""
from pyspark import SparkContext

sc = SparkContext("local[2]", "First Spark App")

# take the raw CSV data and convert it into records of the form (user, product, price)
data = (sc.textFile("data/UserPurchaseHistory.csv")
          .map(lambda line: line.split(","))
          .map(lambda record: (record[0], record[1], record[2])))

# count the total number of purchases
numPurchases = data.count()

# count how many unique users made purchases
uniqueUsers = data.map(lambda record: record[0]).distinct().count()

# sum up our total revenue
totalRevenue = data.map(lambda record: float(record[2])).sum()

# find our most popular product: count purchases per product, then take the largest count
products = data.map(lambda record: (record[1], 1.0)).reduceByKey(lambda a, b: a + b).collect()
mostPopular = sorted(products, key=lambda x: x[1], reverse=True)[0]
print(mostPopular)

# finally, print everything out
print("Total purchases: %d" % numPurchases)
print("Unique users: %d" % uniqueUsers)
print("Total revenue: %2.2f" % totalRevenue)
print("Most popular product: %s with %d purchases" % (mostPopular[0], mostPopular[1]))

# stop the SparkContext
sc.stop()
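To run the script (saved here under the hypothetical name pythonapp.py), you can use spark-submit, or plain python if pyspark was installed via pip:

spark-submit pythonapp.py

Assuming the CSV sits at data/UserPurchaseHistory.csv relative to the working directory, and aside from Spark's own log output, running it against the five sample records above should print roughly the following:

('iPhone Cover', 2.0)
Total purchases: 5
Unique users: 4
Total revenue: 39.91
Most popular product: iPhone Cover with 2 purchases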
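One design note: the script collects every (product, count) pair back to the driver and sorts them there, which is fine for a tiny dataset but does not scale. A minimal alternative sketch, assuming the same data RDD and a still-active SparkContext, lets Spark pick the top product itself:

# alternative sketch: have Spark order the counts instead of sorting on the driver
mostPopular = (data.map(lambda record: (record[1], 1.0))
                   .reduceByKey(lambda a, b: a + b)
                   .takeOrdered(1, key=lambda x: -x[1])[0])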
That wraps up this article on using pyspark to compute statistics over users' purchase records; we hope it proves helpful to fellow developers!