- May 23 (Sun) 2021 17:12
[Python] Downloading the earthquake activity summary list from the Central Weather Bureau
- Mar 12 (Fri) 2021 21:37
[Python] About youtube-dl
Single-file download: call download_video(download_url, destFileFolder), passing the video source download_url and the destination folder destFileFolder, for example this line in main():
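A minimal sketch of what such a download_video wrapper could look like, assuming the standard youtube_dl Python API (the outtmpl option is the library's own; the example URL is a placeholder):

import os
import youtube_dl  # pip install youtube-dl

def download_video(download_url, destFileFolder):
    # save as "<title>.<ext>" inside the destination folder
    ydl_opts = {'outtmpl': os.path.join(destFileFolder, '%(title)s.%(ext)s')}
    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download([download_url])

# e.g. the call in main():
download_video('https://www.youtube.com/watch?v=XXXXXXXXXXX', './videos')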
- Jun 28 (Sun) 2020 16:01
[tensorflow] Using v1 under v2
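The usual way to run v1-style code on a TensorFlow 2 install is the compat.v1 shim; a minimal sketch:

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()  # disables eager execution and other v2 defaults

# v1-style graph-and-session code then runs unchanged on a v2 install
a = tf.placeholder(tf.float32, shape=())
b = a * 2.0
with tf.Session() as sess:
    print(sess.run(b, feed_dict={a: 21.0}))  # 42.0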
- Apr 13 (Mon) 2020 16:16
[python] Coffee bean background removal

Coffee bean recognition: first design the lighting scene and use a single-color background; apply an HSV color filter to get an initial mask, then process the mask with morphology, filling the holes inside it and shrinking its edges, to obtain an image that contains none of the background color.
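A minimal OpenCV sketch of that pipeline; the input file name, HSV bounds, and kernel size are placeholders to be tuned for the real lighting and background color:

import cv2
import numpy as np

img = cv2.imread('beans.jpg')  # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# initial mask: everything that is NOT the single-color background
bg_lower, bg_upper = np.array([35, 60, 60]), np.array([85, 255, 255])
mask = cv2.bitwise_not(cv2.inRange(hsv, bg_lower, bg_upper))

# morphology: close to fill the holes inside the mask, then erode to
# shrink the edges so no background color remains around the beans
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
mask = cv2.erode(mask, kernel)

result = cv2.bitwise_and(img, img, mask=mask)  # background fully removed
cv2.imwrite('beans_nobg.png', result)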
- Mar 10 (Tue) 2020 22:09
[Python] Google Maps API
The recent hot topic of picking up masks at pharmacies has a public CSV available for download, easily fetched with Python. A look at the data shows it updates quite quickly, but it contains addresses without GPS coordinates; putting it on a map requires a conversion, and converting that many points at once costs a lot of money, so the best approach is a cache: store each result as soon as it is fetched, check the cache first when looking up, and only query the API on a miss.
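A minimal sketch of that cache-first lookup, assuming the Google Geocoding web API; GOOGLE_KEY, the cache file name, and the geocode() helper are illustrative names, not from the post:

import json
import os
import requests

GOOGLE_KEY = 'YOUR_API_KEY'        # placeholder
CACHE_FILE = 'geocode_cache.json'  # address -> (lat, lng)
cache = json.load(open(CACHE_FILE)) if os.path.exists(CACHE_FILE) else {}

def geocode(address):
    if address in cache:  # cache hit: no paid API call
        return cache[address]
    resp = requests.get('https://maps.googleapis.com/maps/api/geocode/json',
                        params={'address': address, 'key': GOOGLE_KEY}).json()
    if resp['status'] != 'OK':
        return None
    loc = resp['results'][0]['geometry']['location']
    cache[address] = (loc['lat'], loc['lng'])  # store for next time
    with open(CACHE_FILE, 'w') as f:
        json.dump(cache, f, ensure_ascii=False)
    return cache[address]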
- Dec 12 (Wed) 2018 13:45
[AI] keras-yolo3 connected to an ipcam for object detection
Continuing from the previous post, [AI] keras-yolo3 test, this one explains how to connect an ipcam that supports the ONVIF protocol.
To connect an ipcam, first find its connection URI. If the manual does not give one, download ONVIF Device Manager: log in to the ipcam with a browser to find the port setting, manually add the ipcam address to ONVIF Device Manager's device list, enter the account and password, and connect manually. Click Live Video in the middle column; the video stream is displayed in the right pane, and the URI is shown below it, for example:
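Once the URI is known, OpenCV can read the stream directly and hand each frame to the detector; a minimal sketch, with a placeholder RTSP URI and credentials:

import cv2

uri = 'rtsp://user:password@192.168.1.100:554/stream1'  # substitute the real URI
cap = cv2.VideoCapture(uri)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # frame is a BGR numpy array, ready to feed into the YOLO detector
    cv2.imshow('ipcam', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()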
- Dec 09 (Sun) 2018 22:26
[AI] keras-yolo3 test
YOLO (You Only Look Once) is a neural-network algorithm for object detection.
Its author, Joseph Redmon, implemented it on the darknet framework: lightweight, with few dependencies and a highly efficient algorithm, it is valuable in industrial applications such as pedestrian detection and industrial image inspection. The official site is very thorough, and following it will get basic YOLO detection and training working, but here we only test the operating steps of keras-yolo3, another implementation built on the Keras framework.
- Nov 07 (Wed) 2018 22:40
[Python] Analyzing a double-Gaussian mixture model of Mw with tensorflow
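A minimal TF2-style sketch of fitting such a two-Gaussian mixture by minimizing the negative log-likelihood; the synthetic mw samples and starting values are placeholders (the post presumably used a real earthquake catalog):

import numpy as np
import tensorflow as tf

# synthetic Mw samples standing in for catalog magnitudes
mw = np.concatenate([np.random.normal(4.5, 0.3, 500),
                     np.random.normal(5.8, 0.4, 200)]).astype(np.float32)

w_logit = tf.Variable(0.0)           # mixing weight, via sigmoid
mu = tf.Variable([4.0, 6.0])         # component means
log_sigma = tf.Variable([0.0, 0.0])  # log std-devs keep sigma positive

def neg_log_likelihood():
    w = tf.sigmoid(w_logit)
    sigma = tf.exp(log_sigma)
    x = tf.reshape(mw, (-1, 1))  # (N, 1) broadcasts against (2,)
    norm = tf.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * tf.sqrt(2.0 * np.pi))
    pdf = w * norm[:, 0] + (1.0 - w) * norm[:, 1]
    return -tf.reduce_sum(tf.math.log(pdf + 1e-12))

opt = tf.keras.optimizers.Adam(0.05)
for step in range(2000):
    opt.minimize(neg_log_likelihood, [w_logit, mu, log_sigma])
print(tf.sigmoid(w_logit).numpy(), mu.numpy(), tf.exp(log_sigma).numpy())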
- Oct 15 (Mon) 2018 16:25
[Python] Downloading the earthquake activity summary list from the Central Weather Bureau
The earthquake activity summary list on the Central Weather Bureau site: inspecting the page source, the actual data page is https://scweb.cwb.gov.tw/Page.aspx/?ItemId=20&loc=tw&adv=1. Because it is an ASP page, the ASP postback parameters must be extracted first; the data table's class is datalist4, and BeautifulSoup can then extract the table contents.
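A minimal sketch of that flow, assuming the standard ASP.NET hidden postback fields; the extra search fields for the payload depend on the form and are left out:

import requests
from bs4 import BeautifulSoup

url = 'https://scweb.cwb.gov.tw/Page.aspx/?ItemId=20&loc=tw&adv=1'
sess = requests.Session()

# first GET: collect the ASP postback parameters from the hidden inputs
soup = BeautifulSoup(sess.get(url).text, 'html.parser')
payload = {f['name']: f.get('value', '')
           for f in soup.find_all('input', type='hidden') if f.has_attr('name')}
# ...add the search fields (date range etc.) to payload here...

# POST the form back, then read the data table whose class is datalist4
soup = BeautifulSoup(sess.post(url, data=payload).text, 'html.parser')
table = soup.find('table', class_='datalist4')
for row in table.find_all('tr'):
    cells = [td.get_text(strip=True) for td in row.find_all('td')]
    if cells:
        print(cells)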
- Oct 07 (Sun) 2018 17:45
[Python] Downloading earthquake data from USGS
# -*- coding: utf-8 -*-
"""
Created on Sun Oct 7 11:53:25 2018
@author: ghosty
@Program: usgs_location_list.py
@Purpose: download USGS data and analyze the quake location names for
(1) location relation analysis
(2) translation from English to Traditional Chinese, used for AI analysis tool development
"""
import csv
import requests
import os.path
import datetime
from dateutil.relativedelta import relativedelta

# init global variables
locFileName = 'usgs_loc.txt'
locationList = []
quakeList = []

def downloadUSGS(starttime, endtime, minmagnitude, maxmagnitude):
    # FDSN event web service; returns one CSV row per earthquake
    url = 'http://earthquake.usgs.gov/fdsnws/event/1/query'
    query = {
        'format': 'csv',
        'starttime': starttime,
        'endtime': endtime,
        'minmagnitude': minmagnitude,
        'maxmagnitude': maxmagnitude,
        'eventtype': 'earthquake'
    }
    response = requests.get(url, params=query)
    if response.status_code != requests.codes.ok:
        print("USGS request failed")
        return [], []
    data = list(csv.reader(response.text.split('\n'), delimiter=','))
    # time,latitude,longitude,depth,mag,magType,nst,gap,dmin,rms,net,id,updated,place,type,horizontalError,depthError,magError,magNst,status,locationSource,magSource
    header = data.pop(0)  # remove header line
    print('USGS download #', len(data))
    return data, header

def addLocation(quakeData):
    for quake in quakeData:
        if len(quake) == 0:  # skip null record
            continue
        quakeTime = quake[0]
        quakeLatitude = quake[1]
        quakeLongitude = quake[2]
        quakeDepth = quake[3]
        quakeMag = quake[4]
        quakeMagType = quake[5]
        quakePlace = quake[13]
        locs = quakePlace.split(',')
        for loc in locs:
            try:
                # match " of " with spaces, not bare 'of', to prevent false hits inside words
                ofPos = loc.find(" of ")
                if ofPos > 0:
                    loc = loc[ofPos + 4:]
                loc = loc.replace("the ", "")  # str.replace returns a new string
                loc = loc.strip()              # so does strip(); reassign the result
                if loc not in locationList:
                    record = '{0} {1} {2} {3} {4} {5} {6}'.format(
                        quakeTime, quakeLatitude, quakeLongitude,
                        quakeDepth, quakeMag, quakeMagType, quakePlace)
                    print("Add [", loc, '] from [', record, ']')
                    quakeList.append(record)
                    locationList.append(loc)
            except Exception:
                # source data error
                print('parse fail: ', quakeTime, quakeLatitude, quakeLongitude,
                      quakeDepth, quakeMag, quakeMagType, quakePlace)

def saveLocation(locationList, locFileName):
    print("export ", len(locationList), " locations")
    with open(locFileName, 'w') as locfile:
        for i in range(len(locationList)):
            locfile.write('"' + str(quakeList[i]) + '"$"' + str(locationList[i]) + '"\n')

def loadLocation(locationList, locFileName):
    if os.path.exists(locFileName):
        with open(locFileName, 'r') as locfile:
            for line in locfile:
                quakeData = line.strip().split('$')
                quakeList.append(quakeData[0].strip('"'))
                locationList.append(quakeData[1].strip('"'))
        print("import ", len(locationList), " locations")

# ------main()--------
# load previously processed location data
loadLocation(locationList, locFileName)
# walk through the whole catalog, one year per request
dt1 = datetime.date(1900, 1, 1)
#end_dt = datetime.date(2000, 1, 31)
end_dt = datetime.date.today()
while dt1 < end_dt:
    # if one download returns too many records, shorten the period, e.g.:
    #if dt1 < datetime.date(2018, 1, 1):
    #    delta_dt = relativedelta(years=1)
    #else:
    #    delta_dt = relativedelta(months=1)
    delta_dt = relativedelta(years=1)
    dt2 = dt1 + delta_dt
    print("\nProcessing...", dt1.strftime("%Y-%m-%d"), "~", dt2.strftime("%Y-%m-%d"))
    quakeData, header = downloadUSGS(dt1, dt2, 4.0, 10)
    addLocation(quakeData)
    dt1 = dt2
# save the processed result
saveLocation(locationList, locFileName)