* Google Drive API
Enable a project for development:
https://console.developers.google.com/project
Documentation for Python:
https://developers.google.com/drive/web/quickstart/quickstart-python
-- sample: copy a document to Google Drive
#!/usr/bin/python
# Copy a document to Google Drive.
import httplib2
import pprint
from apiclient.discovery import build
from...
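The sample above is cut off before the upload itself. Assuming the OAuth2 flow from the quickstart has already produced an authorized `http` object, a minimal sketch of the Drive v2 upload step could look like this; the file name and title here are hypothetical examples, not from the original post:

```python
# Build the metadata body for a Drive v2 files().insert() request.
# Title, description, and file name below are hypothetical examples.

def drive_file_body(title, description, mime_type):
    """Return the metadata dict expected by the Drive v2 insert call."""
    return {
        'title': title,
        'description': description,
        'mimeType': mime_type,
    }

body = drive_file_body('My Document', 'A test document', 'text/plain')

# With real credentials, the upload itself would look roughly like:
# from apiclient.discovery import build
# from apiclient.http import MediaFileUpload
# service = build('drive', 'v2', http=http)
# media_body = MediaFileUpload('document.txt', mimetype='text/plain')
# uploaded = service.files().insert(body=body, media_body=media_body).execute()
```

The network call is left commented out because it needs live credentials; only the request body construction runs as-is.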
Friday, August 29, 2014
Wednesday, August 20, 2014
Image analysis with Python
Posted on 8/20/2014 09:11:00 PM by Mark
1. matplotlib (a Python 2D plotting library)
- site (api)
http://matplotlib.org/api/pyplot_api.html
2. OpenCV (an open source computer vision and machine learning library)
- site (api)
http://opencv.org/
- template matching
http://docs.opencv.org/doc/tutorials/imgproc/histograms/template_matching/template_matching.html#which-are-the-matching-methods-available-in-opencv
3....
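The template matching tutorial linked above slides a template over the image and scores each position. The core idea behind OpenCV's TM_SQDIFF method can be sketched in pure Python, without the OpenCV dependency; the tiny image and template below are made-up example data:

```python
# Pure-Python sketch of template matching by sum of squared differences,
# the idea behind OpenCV's TM_SQDIFF method (no OpenCV dependency).

def match_template_sqdiff(image, template):
    """Return (row, col) where `template` best matches inside `image`.

    `image` and `template` are 2D lists of grayscale values; the position
    with the smallest sum of squared differences wins.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_pos, best_score = None, float('inf')
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Hypothetical example: find a 2x2 bright patch in a 4x4 image.
image = [
    [0, 0, 0, 0],
    [0, 9, 9, 0],
    [0, 9, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 9], [9, 9]]
print(match_template_sqdiff(image, template))  # -> (1, 1)
```

OpenCV's `cv2.matchTemplate` does the same scan vectorized in C++, with several scoring methods beyond squared difference.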
Monday, August 18, 2014
some things about Linux
Posted on 8/18/2014 08:22:00 PM by Mark
Referred link:
http://www.tecmint.com/install-google-chrome-on-redhat-centos-fedora-linux/
Step 1: Enable Google YUM repository
Create a file called /etc/yum.repos.d/google-chrome.repo and add the following lines of code to it.
[google-chrome]
name=google-chrome
baseurl=http://dl.google.com/linux/chrome/rpm/stable/$basearch
enabled=1
gpgcheck=1
gpgkey=https://dl-ssl.google.com/linux/linux_signing_key.pub
Step...
Tuesday, August 12, 2014
for using Postgres-XL
Posted on 8/12/2014 01:13:00 PM by Mark

INSTALL
-- initialize each instance
initgtm -Z gtm -D /var/lib/pgxl/9.2/data_gtm
initdb -D /var/lib/pgxl/9.2/coord01 --nodename coord01
initdb -D /var/lib/pgxl/9.2/data01 --nodename data01
initdb -D /var/lib/pgxl/9.2/data02 --nodename data02
-- start each instance
gtm_ctl -Z gtm...
Friday, August 8, 2014
sample code for collecting data
Posted on 8/08/2014 04:24:00 PM by Mark
1. data collection using python
# Python library for pulling data out of HTML or XML
# http://www.crummy.com/software/BeautifulSoup/bs4/doc/index.html
-- pulling_data.py
import codecs
import urllib2
from bs4 import BeautifulSoup
f = urllib2.urlopen('http://www.daum.net')
html_doc = f.read()
soup = BeautifulSoup(html_doc, 'html.parser')  # specify the parser explicitly
# for Hangul (Korean) output
with codecs.open('result_daum.txt','w',encoding='utf8')...
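When BeautifulSoup is not available, the same kind of pulling can be sketched with the standard library's `html.parser` module instead (Python 3 shown here); the HTML document below is a made-up stand-in for the fetched page:

```python
# Link extraction with the standard library's html.parser, as an
# alternative sketch when BeautifulSoup is not installed.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href':
                    self.links.append(value)

html_doc = '<html><body><a href="http://www.daum.net">Daum</a></body></html>'
parser = LinkExtractor()
parser.feed(html_doc)
print(parser.links)  # -> ['http://www.daum.net']
```

BeautifulSoup is still the more convenient choice for messy real-world HTML; the stdlib parser requires valid-enough markup and more hand-written traversal code.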
sample code to move data within the Hadoop framework
Posted on 8/08/2014 01:17:00 PM by Mark
1. flume
-- fox.conf
# Name the components on this agent
# fox -> zoo -> koala
agent.sinks = koala
agent.sources = fox
agent.channels = zoo
# Describe/configure the source
agent.sources.fox.type = spooldir
agent.sources.fox.spoolDir = /home/flume/dump
# Describe the sink
agent.sinks.koala.type = hdfs
agent.sinks.koala.hdfs.path = /flume/events
agent.sinks.koala.hdfs.fileType...
Popular Baby Names Top 50 since 1980
Posted on 8/08/2014 12:50:00 PM by Mark

I am studying data analysis with R.
First, I wondered how many people have used my name, among other things.
A word cloud gave me the best visualization.
Here is a histogram plot for my name, Mark.
R code tested with R version 3.1.1 and RStudio version 0.98.978.
# national popular...
Tuesday, July 15, 2014
mongoDB usable scripts
Posted on 7/15/2014 07:41:00 PM by Mark
# get the average over a collection, filtered by a condition.
db.POINT_TOTAL_OBS_STATION_DATA.group(
{ cond: { obs_item_id : "OBSCD00074" }
, initial: {count: 0, total:0}
, reduce: function(doc, out) { out.count++ ; out.total += doc.v1 }
, finalize: function(out) { out.avg = out.total / out.count }
}...
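To see what that `group()` call computes, the cond/initial/reduce/finalize pipeline can be mirrored in plain Python; the sample documents and their `v1` values below are hypothetical:

```python
# Plain-Python mirror of the MongoDB group() logic above: count the matching
# documents, total their v1 values, then finalize with the average.
# The sample documents below are hypothetical.

def group_average(docs, obs_item_id):
    out = {'count': 0, 'total': 0}                # initial
    for doc in docs:
        if doc['obs_item_id'] == obs_item_id:     # cond
            out['count'] += 1                     # reduce
            out['total'] += doc['v1']
    if out['count']:
        out['avg'] = out['total'] / out['count']  # finalize
    return out

docs = [
    {'obs_item_id': 'OBSCD00074', 'v1': 10},
    {'obs_item_id': 'OBSCD00074', 'v1': 20},
    {'obs_item_id': 'OTHER', 'v1': 99},
]
print(group_average(docs, 'OBSCD00074'))
```

In MongoDB the reduce function runs server-side per matching document, so nothing but the final aggregate crosses the wire.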
Monday, July 7, 2014
R sample code
Posted on 7/07/2014 02:48:00 PM by Mark
# R environments
-- to change the Java heap size
-- R_HOME/etc/Rprofile.site
options(java.parameters = c("-Xmx16g","-Dfile.encoding=UTF-8"))
-- to read UTF-8 encoded files
f <- file("d:/parser.txt", blocking=F,encoding="UTF-8")
txtLines <- readLines(f)
# 1. how to collect stock info.
install.packages("fImport")
library(fImport)
s_e <- yahooSeries("005935.KS")
plot(s_e)
#...
Monday, April 28, 2014
how to use bzip2, which supports splittable compression, with the hadoop-streaming package
Posted on 4/28/2014 03:11:00 PM by Mark
sample code for testing
1. compression
# Produce compressed files, per block, from each mapper.
hadoop jar hadoop-streaming-2.2.0.2.1.0.0-92.jar \
-D mapreduce.output.fileoutputformat.compress=true \
-D mapreduce.output.fileoutputformat.compress.type=RECORD \
-D mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.BZip2Codec...
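The reason bzip2 works here is its block structure: each compressed block can be decompressed independently, which is what lets Hadoop split one `.bz2` file across mappers. A quick round trip with Python's standard-library `bz2` module illustrates the codec itself (separate from running the streaming job); the record data is made up:

```python
# bzip2 compresses in independent blocks, which is what makes .bz2 files
# splittable across Hadoop mappers. Standard-library round trip:
import bz2

# Hypothetical line-oriented records, like a mapper would consume.
records = b'\n'.join(b'record-%d' % i for i in range(1000))
compressed = bz2.compress(records, compresslevel=9)

# The compressed stream decompresses back to the original bytes.
restored = bz2.decompress(compressed)
assert restored == records
print(len(records), '->', len(compressed), 'bytes')
```

By contrast, gzip is a single stream and cannot be split, so a large `.gz` input forces one mapper to read the whole file.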