
python – Between an on-disk database and a fast in-memory database...

流星雨 · created 200 days ago · 166 views

I've recently been working on a co-movement (companion-detection) algorithm for trajectory data. The data volume is very large — over 80 million rows in just two months — and I'm currently testing a distributed MySQL setup.

http://www.cocoachina.com/articles/67422

import sqlite3

def copy_database(source_connection, dest_dbname=':memory:'):
    '''Return a connection to a new copy of an existing database.

    Raises an sqlite3.OperationalError if the destination already exists.
    '''
    script = ''.join(source_connection.iterdump())
    dest_conn = sqlite3.connect(dest_dbname)
    dest_conn.executescript(script)
    return dest_conn

if __name__ == '__main__':
    from contextlib import closing

    # Load the disk database into memory, then close the disk connection.
    with closing(sqlite3.connect('pepsearch.db')) as disk_db:
        mem_db = copy_database(disk_db)
    # Work on the fast in-memory copy...
    mem_db.execute('DELETE FROM documents WHERE uri="pep-3154"')
    mem_db.commit()
    # ...and persist the changed copy back to disk.
    copy_database(mem_db, 'changed.db').close()
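As an aside: on Python 3.7+, the same disk-to-memory copy can also be done with sqlite3's built-in backup API, which copies database pages directly instead of round-tripping everything through a SQL text dump via iterdump(). A minimal sketch (the table and values here are illustrative, not from the post):

```python
import sqlite3

def copy_via_backup(source_connection, dest_dbname=':memory:'):
    """Copy a database using sqlite3's backup API (Python 3.7+).

    Unlike iterdump()/executescript(), this transfers pages
    directly and avoids building one giant SQL string in memory.
    """
    dest_conn = sqlite3.connect(dest_dbname)
    source_connection.backup(dest_conn)
    return dest_conn

# Illustrative usage: build a small source db, then copy it into memory.
src = sqlite3.connect(':memory:')
src.execute('CREATE TABLE documents (uri TEXT)')
src.execute("INSERT INTO documents VALUES ('pep-3154')")
src.commit()
mem_copy = copy_via_backup(src)
```

For a large database this is usually the better choice, since the dump-based approach materializes the entire database as a Python string before replaying it.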

