How to store data from an Arduino sensor into Hadoop HDFS in real time

olqngx59 · 2021-05-29 · posted in Hadoop

I am working on a project that uses HDFS, and I want to store Arduino sensor data into a CSV file on Hadoop HDFS every 3 seconds.
Example CSV file:
'temp1','datetime1','location1'
'temp2','datetime2','location2'
'temp3','datetime3','location3'
Every 3 seconds, I want to append one new row to this CSV file.
I already have a Python script that reads from the Arduino's serial port and writes to a NoSQL database, and I tried the same approach here, but I ran into a problem with the HDFS path.


import datetime

import pandas as pd

# client_hdfs is the hdfs-package client object, created earlier in the script

# Creating a simple Pandas DataFrame from one sensor reading
liste_temp = [temp_string, datetime.datetime.now(), temperature_location]
df = pd.DataFrame(data={'temp': liste_temp})

# Writing the DataFrame to HDFS
with client_hdfs.write('/test/temp.csv', encoding='utf-8') as writer:
    df.to_csv(writer)
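
I also notice that, as written, liste_temp ends up as a single 'temp' column with three stacked values (temperature, timestamp, location) rather than one row with three fields, and that client_hdfs.write creates the file from scratch each time (with the default overwrite=False it fails once /test/temp.csv already exists). A one-row frame matching the sample layout above would presumably look like this sketch (the column names are my guess from the sample):

# One row with three named fields, matching the sample CSV layout
df = pd.DataFrame([[temp_string, datetime.datetime.now(), temperature_location]],
                  columns=['temp', 'datetime', 'location'])

The write itself, however, fails before any of that matters.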

Error:

File "templog.py", line 33, in <module> with client_hdfs.write('/test/temp.csv', encoding = 'utf-8') as writer: File "C:\Users\nouhl\AppData\Local\Programs\Python\Python37-32\lib\site-packages\hdfs\client.py", line 460, in write raise
InvalidSchema("No connection adapters were found for '%s'" % url) requests.exceptions.InvalidSchema: No connection adapters were found for 'hdfs://localhost:9870/webhdfs/v1/test/temp.csv
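
For what it's worth, the exception is raised by the requests library, which the hdfs package uses for its WebHDFS calls: requests only understands http:// and https:// URLs, so a client rooted at hdfs://localhost:9870 can never connect. WebHDFS is an HTTP API, so the client apparently needs to be pointed at the NameNode's web address instead (port 9870 serves HTTP on Hadoop 3.x). A minimal sketch of the whole 3-second append loop under that assumption, using the hdfs package's InsecureClient and pyserial, with the HDFS user name, serial port, and location string as placeholders:

import datetime
import time

import pandas as pd
import serial                    # pyserial, for reading the Arduino
from hdfs import InsecureClient

# WebHDFS speaks HTTP: use the NameNode's web address, not an hdfs:// URL
client_hdfs = InsecureClient('http://localhost:9870', user='hadoop')  # user: placeholder

arduino = serial.Serial('COM3', 9600)   # serial port name: placeholder
temperature_location = 'location1'      # placeholder

while True:
    # One reading per line from the Arduino, e.g. "23.5"
    temp_string = arduino.readline().decode().strip()
    df = pd.DataFrame([[temp_string, datetime.datetime.now(), temperature_location]],
                      columns=['temp', 'datetime', 'location'])
    # append=True adds rows to the existing file instead of recreating it
    with client_hdfs.write('/test/temp.csv', encoding='utf-8', append=True) as writer:
        df.to_csv(writer, header=False, index=False)
    time.sleep(3)   # one new row every 3 seconds

Since append=True only works on a file that already exists, /test/temp.csv would need to be created once before the loop, for example with a plain client_hdfs.write('/test/temp.csv', data='', overwrite=True).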

No answers yet.
