Search Results

Search found 27118 results on 1085 pages for 'mysql python'.


  • What's a better choice for SQL-backed number crunching - Ruby 1.9, Python 2, Python 3, or PHP 5.3?

    - by Ivan
    Criteria for 'better': fast at math and at simple (few fields, many records) DB transactions, convenient to develop/read/extend, flexible, and easy to connect to other systems. The task is to use a common web-development scripting language to process and calculate long time series and multidimensional surfaces (mostly selecting/inserting sets of floats and doing math with them). The choices are Ruby 1.9, Python 2, Python 3, PHP 5.3, Perl 5.12, and JavaScript (node.js). All the data is to be stored in a relational database (due to its heavily multidimensional nature), and all communication with the outside world is to be done by means of web services.

    Read the article

  • Python: User-Defined Exception That Proves The Rule

    - by bandana
    The Python documentation states: "Exceptions should typically be derived from the Exception class, either directly or indirectly." The word 'typically' leaves me in an ambiguous state. Consider the code:

        class good(Exception): pass
        class bad(object): pass
        Heaven = good()
        Hell = bad()

        >>> raise Heaven
        Traceback (most recent call last):
          File "<pyshell#163>", line 1, in <module>
            raise Heaven
        good

        >>> raise Hell
        Traceback (most recent call last):
          File "<pyshell#171>", line 1, in <module>
            raise Hell
        TypeError: exceptions must be classes or instances, not bad

    So when reading the Python docs, should I read 'typically' as 'always'? What if I have a class hierarchy that has nothing to do with the Exception class, and I want to raise objects belonging to that hierarchy? I can always raise an exception with an argument (raise Exception, Hell), but this seems slightly awkward to me. What's so special about the Exception class, that only its family members can be raised?
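
    A minimal sketch of the rule in practice (Python 3 spelling; the traceback wording above is from Python 2): only instances or subclasses of BaseException can be raised, and the usual workaround for a foreign hierarchy is a small wrapper exception that carries the object as a payload. The Carrier name below is hypothetical, not from the question.

        class Good(Exception):
            pass

        class Bad(object):
            pass

        try:
            raise Bad()
        except TypeError as exc:
            # Python 3 phrasing: "exceptions must derive from BaseException"
            print("rejected:", exc)

        # Workaround: wrap the non-exception object in an Exception subclass.
        class Carrier(Exception):
            def __init__(self, payload):
                super().__init__(repr(payload))
                self.payload = payload

        try:
            raise Carrier(Bad())
        except Carrier as exc:
            print("payload is a", type(exc.payload).__name__)  # payload is a Bad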

    Read the article

  • Python json memory bloat

    - by Anoop
        import json
        import time
        from itertools import count

        def keygen(size):
            for i in count(1):
                s = str(i)
                yield '0' * (size - len(s)) + str(s)

        def jsontest(num):
            keys = keygen(20)
            kvjson = json.dumps(dict((keys.next(), '0' * 200) for i in range(num)))
            kvpairs = json.loads(kvjson)
            del kvpairs  # Not required. Just to check if it makes any difference
            print 'load completed'

        jsontest(500000)

        while 1:
            time.sleep(1)

    Linux top indicates that the Python process holds ~450 MB of RAM after the 'jsontest' function completes. If the call to 'json.loads' is omitted, the issue is not observed. An explicit gc.collect() after the function runs does release the memory, so it looks like the memory is not held in any caches or in Python's internal memory allocator. Is this happening because the garbage-collection thresholds (700, 10, 10) were never reached? I did put some code after jsontest to simulate reaching the threshold, but it didn't help.
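
    To test the threshold theory directly, one hedged diagnostic (Python 3 spelling; the dict comprehension stands in for the generator above) is to watch the collector's per-generation counters around the round-trip and force a collection to see how much is actually gc-reclaimable:

        import gc
        import json

        print(gc.get_threshold())   # default thresholds, e.g. (700, 10, 10)
        print(gc.get_count())       # allocations since the last collection, per generation

        data = json.loads(json.dumps({str(i): '0' * 200 for i in range(500000)}))
        del data

        print(gc.get_count())       # typically far above the generation-0 threshold
        freed = gc.collect()        # returns the number of unreachable objects found
        print('collected:', freed)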

    Read the article

  • Group MySQL Data into Arbitrarily Sized Time Buckets

    - by Eric J.
    How do I count the number of records in a MySQL table, based on a timestamp column, per unit of time where the unit of time is arbitrary? Specifically, I want to count how many records' timestamps fell into 15-minute buckets during a given interval. I understand how to do this in buckets of 1 second, 1 minute, 1 hour, 1 day, etc. using MySQL date functions, e.g.

        SELECT YEAR(datefield) Y, MONTH(datefield) M, DAY(datefield) D, COUNT(*) Cnt
        FROM mytable
        GROUP BY YEAR(datefield), MONTH(datefield), DAY(datefield)

    but how can I group by 15-minute buckets?
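
    One common approach, sketched below (the connection details are placeholders, and the table/column names follow the example above): floor each timestamp to its 15-minute boundary with UNIX_TIMESTAMP and group on that boundary.

        import mysql.connector  # assumes the MySQL Connector/Python driver is installed

        query = """
            SELECT FROM_UNIXTIME(FLOOR(UNIX_TIMESTAMP(datefield) / 900) * 900) AS bucket_start,
                   COUNT(*) AS cnt
            FROM mytable
            GROUP BY bucket_start
            ORDER BY bucket_start
        """

        conn = mysql.connector.connect(host="localhost", user="user",
                                       password="pass", database="db")
        cur = conn.cursor()
        cur.execute(query)
        for bucket_start, cnt in cur:   # one row per 15-minute (900-second) bucket
            print(bucket_start, cnt)
        conn.close()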

    Read the article

  • Organize array in PHP from mysql

    - by Matthew Carter
    Hi, I have a social networking website, and I want it to pull out my friends' status updates. Basically I have a MySQL query that pulls out all of my friends, and inside that while loop there is another MySQL query that pulls out the statuses for each friend. I want the results ordered by date, but since it is one while loop inside another, it pulls out all the statuses from friend 1, then friend 2, then friend 3, and not in order by date. I even tried ORDER BY date, but that only ordered the statuses within each friend. My thought is that I could put it all into an array, with friends as the keys and the statuses as the values, and then sort by values. Would this work, and how could I do it? Thanks so much.

    Read the article

  • Mysql query taking too much time

    - by aditya
    I have a problem related to a MySQL database. I am a Linux webserver admin and I am facing a problem with a MySQL query. The database is very small. I tried to track it in the logs and found that a query is taking a minimum of 5 seconds to respond. The first page of the site comes from the database; the client is using a CMS. When the server gets some number of hits, the database server starts to respond very slowly and the wait time increases from 5 seconds to many more. I checked the slow query log:

        Query_time: 11.480138  Lock_time: 0.003837  Rows_sent: 921  Rows_examined: 3333
        SET timestamp=1346656767;
        SELECT `Tender`.`id`, `Tender`.`department_id`, `Tender`.`title_english`,
               `Tender`.`content_english`, `Tender`.`title_hindi`, `Tender`.`content_hindi`,
               `Tender`.`file_name`, `Tender`.`start_publish`, `Tender`.`end_publish`,
               `Tender`.`publish`, `Tender`.`status`, `Tender`.`createdBy`, `Tender`.`created`,
               `Tender`.`modifyBy`, `Tender`.`modified`
        FROM `mcms_tenders` AS `Tender`
        WHERE `Tender`.`department_id` IN ( 31, 33, 32, 30 );

    Every line in the log is the same; only the query time differs. Is there any way to tweak the performance?
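
    A hedged first step (an assumption, since the question gives no schema details beyond the log): run EXPLAIN on the logged query to see whether department_id is indexed, and add an index if it is not. A sketch from any Python DB-API cursor, with placeholder connection details:

        import mysql.connector  # assumes the MySQL Connector/Python driver

        conn = mysql.connector.connect(host="localhost", user="user",
                                       password="pass", database="db")
        cur = conn.cursor()

        # 1. Check the access plan: a 'type' of ALL means a full table scan.
        cur.execute("EXPLAIN SELECT id FROM mcms_tenders "
                    "WHERE department_id IN (31, 33, 32, 30)")
        for row in cur:
            print(row)

        # 2. If department_id is unindexed, an index usually removes the scan.
        cur.execute("ALTER TABLE mcms_tenders "
                    "ADD INDEX idx_department_id (department_id)")
        conn.close()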

    Read the article

  • Partial Upload With storbinary in python

    - by brian
    I've written some Python code to download an image using urllib.urlopen().read() and then upload it to an FTP site using ftplib.FTP().storbinary(), but I'm having a problem. Sometimes the image file is only partially uploaded, so I get images with the bottom 20% or so cut off. I've checked the locally downloaded version and the entire image is downloaded successfully, which leads me to believe the problem is with storbinary. I believe I am opening and closing all of the files correctly. Does anyone have any clues as to why I'm getting a partial upload with storbinary? Update: When I run through the commands in the Python shell, the upload completes successfully. I don't know why it would be different when run as a script...
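
    A minimal sketch of this workflow (host, credentials and URL are placeholders; Python 3 module names are used): the usual causes of truncated uploads are the script exiting before the transfer is acknowledged, or handing storbinary a file object that was never flushed or rewound. Keeping the data in a BytesIO and closing the session with quit() avoids both.

        import ftplib
        import io
        import urllib.request  # urllib/urllib2 on the Python 2 of the question

        data = urllib.request.urlopen("http://example.com/image.jpg").read()

        ftp = ftplib.FTP("ftp.example.com")
        ftp.login("user", "password")
        try:
            # BytesIO is always positioned at offset 0 and needs no flushing
            ftp.storbinary("STOR image.jpg", io.BytesIO(data))
        finally:
            # quit() sends QUIT and waits for the server's reply, so the
            # transfer is complete before the script exits
            ftp.quit()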

    Read the article

  • Integrating TFS and MySQL

    - by user294043
    We are developing an application with Visual Studio 2008 and TFS. Our database is a MySQL DB. As we develop, we keep the new queries that need to be applied to the database for each new release as the "new version update queries". Right now I'm keeping them in a simple text file (which is a painful task!). I know that TFS integrates with MS SQL and makes this job very easy. I've already asked our consultant from Microsoft if there is any way to integrate TFS and MySQL, and his answer was "no". So I was wondering if anyone knows a smart way of handling this issue?

    Read the article

  • Python: Attractive, clean, packagable windows GUI library

    - by Parand
    I need to create a simple Windows-based GUI for a desktop application that will be downloaded by end users. The application is written in Python and will be packaged as an installer or executable. The functionality I need is simple: selecting from various lists, showing progress bars, etc. No animations, sprites, or other taxing/exotic things. There seem to be quite a few options for Python GUI libraries (Tk, Qt, wxPython, GTK, etc.). What do you recommend that: is easy to learn and maintain; can be cleanly packaged using py2exe or something similar; and looks nice?
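
    For scale, a minimal sketch of the two widgets mentioned, using Tkinter/ttk from the standard library (Python 3 module names; on Python 2 they are Tkinter and ttk). Because it ships with CPython, py2exe-style packaging needs no extra GUI dependency:

        import tkinter as tk
        from tkinter import ttk

        root = tk.Tk()
        root.title("Demo")

        # A simple list selection
        choices = tk.Listbox(root, height=5)
        for item in ("alpha", "beta", "gamma"):
            choices.insert(tk.END, item)
        choices.pack(padx=10, pady=10, fill=tk.X)

        # A determinate progress bar
        progress = ttk.Progressbar(root, mode="determinate", maximum=100)
        progress.pack(padx=10, pady=(0, 10), fill=tk.X)

        def advance():
            progress.step(10)         # bump the bar by 10%
            root.after(200, advance)  # re-schedule on the Tk event loop

        advance()
        root.mainloop()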

    Read the article

  • python: calling constructor from dictionary?

    - by Jason S
    I'm not quite sure of the terminology here, so please bear with me... Let's say I have a constructor call like this:

        machineSpecificEnvironment = Environment(
            TI_C28_ROOT = 'C:/appl/ti/ccs/4.1.1/ccsv4/tools/compiler/c2000',
            JSDB = 'c:/bin/jsdb/jsdb.exe',
            PYTHON_PATH = 'c:/appl/python/2.6.4',
        )

    except I would like to replace that with an operation on a dictionary provided to me:

        keys = {'TI_C28_ROOT': 'C:/appl/ti/ccs/4.1.1/ccsv4/tools/compiler/c2000',
                'JSDB': 'c:/bin/jsdb/jsdb.exe',
                'PYTHON_PATH': 'c:/appl/python/2.6.4'}

        machineSpecificEnvironment = Environment(
            ... what do I put here? it needs to be a function of "keys" ...
        )

    How can I do this?
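
    The usual answer is keyword-argument unpacking with **: the dictionary's keys become the parameter names the constructor receives. A self-contained sketch (the Environment below is a stand-in so the example runs on its own; the real one comes from the build tool in the question):

        # Stand-in for the SCons-style Environment used in the question.
        def Environment(**kwargs):
            return dict(kwargs)

        keys = {'TI_C28_ROOT': 'C:/appl/ti/ccs/4.1.1/ccsv4/tools/compiler/c2000',
                'JSDB': 'c:/bin/jsdb/jsdb.exe',
                'PYTHON_PATH': 'c:/appl/python/2.6.4'}

        # ** unpacks the dict into keyword arguments, equivalent to writing
        # Environment(TI_C28_ROOT=..., JSDB=..., PYTHON_PATH=...)
        machineSpecificEnvironment = Environment(**keys)
        print(machineSpecificEnvironment['JSDB'])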

    Read the article

  • SubSonic isn't generating MySql foreign key tables

    - by keith
    I have two tables within a MySQL 5.1.34 database. When using SubSonic to generate the DAL, the foreign-key relationship doesn't get scripted, i.e. I have no Parent.ChildCollection object. Looking inside the generated DAL Parent class shows the following:

        //no foreign key tables defined (0)

    I have tried SubSonic 2.1 and 2.2, and various MySQL 5 versions. I must be doing something wrong procedurally; any help would be greatly appreciated. This has always just worked out of the box when using MS SQL.

        CREATE TABLE `parent` (
          `ParentId` INT(11) NOT NULL AUTO_INCREMENT,
          `SomeData` VARCHAR(25) DEFAULT NULL,
          PRIMARY KEY (`ParentId`)
        ) ENGINE=INNODB DEFAULT CHARSET=latin1;

        CREATE TABLE `child` (
          `ChildId` INT(11) NOT NULL AUTO_INCREMENT,
          `ParentId` INT(11) NOT NULL,
          `SomeData` VARCHAR(25) DEFAULT NULL,
          PRIMARY KEY (`ChildId`),
          KEY `FK_child` (`ParentId`),
          CONSTRAINT `FK_child` FOREIGN KEY (`ParentId`) REFERENCES `parent` (`ParentId`)
        ) ENGINE=INNODB DEFAULT CHARSET=latin1;

    Read the article

  • Splitting a MySQL DB in two may ease server from "Too many connections"? I don't think so

    - by Petruza
    I was asked to split a MySQL DB in two. It's a kind of horizontal partition, in which some rows correspond to one site and some others correspond to another site. But they want to split it into two DBs on the same MySQL server. I'm no DB expert, but I guess keeping them on the same MySQL server, with the same amount of memory and processor and the same platform, won't improve things. What we're trying to avoid is the "Too many connections" problem.

    Read the article

  • Is there a standard way to store a database schema outside a Python app

    - by acrosman
    I am working on a small database application in Python (currently targeting 2.5 and 2.6) using sqlite3. It would be helpful to be able to provide a series of functions that could set up the database and validate that it matches the current schema. Before I reinvent the wheel, I thought I'd look around for libraries that provide something similar. I'd love to have something akin to RoR's migrations. xml2ddl doesn't appear to be meant as a library (although it could be used that way), and more importantly it doesn't support sqlite3. I'm also worried about the need to move to Python 3 one day, given the lack of recent attention to xml2ddl. Are there other tools around that people are using to handle this?
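
    Absent a library, the hand-rolled pattern is small enough to sketch (an illustration, not a tool the question mentions): keep numbered DDL steps with the app and record the applied version in the database itself, so setup and validation are the same function.

        import sqlite3

        # Each entry is one schema version; new releases append to this list.
        MIGRATIONS = [
            "CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)",
            "ALTER TABLE notes ADD COLUMN created_at TEXT",
        ]

        def migrate(path):
            conn = sqlite3.connect(path)
            conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
            current = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()[0] or 0
            # Apply only the steps the database has not seen yet.
            for version, ddl in enumerate(MIGRATIONS[current:], start=current + 1):
                conn.execute(ddl)
                conn.execute("INSERT INTO schema_version (version) VALUES (?)", (version,))
                conn.commit()
            conn.close()

        migrate("app.db")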

    Read the article

  • How can I add encoding to a Python-generated CSV file

    - by user1958218
    I am following this post http://stackoverflow.com/a/9016545 and I want to know how I can do the same thing in Python. I don't know how to insert the BOM data. This is my current code:

        response = HttpResponse(content_type='text/csv')
        response['Content-Type'] = 'application/octet-stream'
        response['Content-Disposition'] = 'attachment; filename="results.csv"'
        writer = UnicodeWriter(response, quoting=csv.QUOTE_ALL, encoding="utf-8")

    I want to convert to UTF-16. The BOM is shown here http://stackoverflow.com/a/4440143:

        echo "\xEF\xBB\xBF"; // UTF-8 BOM

    but I don't know how to do the same in Python, and for UTF-16. I tried opening the CSV in Notepad and inserting \xef\xbb\xbf at the beginning, and Excel then displayed the data correctly, but the marker is also visible before the first column. How can I hide that, because users won't like it?
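
    A standalone sketch of the two usual options (plain files here instead of the Django response above; Python 3 spelling): the BOM has to be written as raw bytes before the first CSV row, not typed into the first cell, which is why it showed up as visible characters in the Notepad experiment.

        import codecs
        import csv

        rows = [["name", "city"], ["Ana", "São Paulo"]]

        # Option 1: keep UTF-8 and prepend its BOM so Excel auto-detects the encoding.
        with open("results_utf8.csv", "wb") as f:
            f.write(codecs.BOM_UTF8)                    # b'\xef\xbb\xbf'
            for row in rows:
                f.write((",".join(row) + "\r\n").encode("utf-8"))

        # Option 2: UTF-16, where Python's codec writes the BOM automatically.
        with open("results_utf16.csv", "w", encoding="utf-16", newline="") as f:
            csv.writer(f, quoting=csv.QUOTE_ALL).writerows(rows)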

    Read the article

  • Unix timestamp and mysql date: birthdate

    - by Mikk
    Hi, I have a really basic question concerning Unix timestamps and MySQL dates. I'm trying to build a small website where users can register and fill in their birthdate. The problem is that Unix time starts at Jan 01 1970, so if I calculate ages for users and format dates like date('m.d.Y', $unix_from_db) and so on, it will fail for users older than 40 years, right? So what would be the right way of doing this? Sorry for such a basic question, but I'm inexperienced with PHP and MySQL.
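
    The question is about PHP, but the idea carries over and is sketched here in Python for illustration: store the birthdate in a MySQL DATE column (which has no 1970 lower bound, unlike a Unix timestamp) and compute the age from calendar fields rather than from epoch seconds.

        from datetime import date

        # Assumes birthdates come from a DATE column, e.g. '1955-06-18'.
        def age(birthdate, today=None):
            today = today or date.today()
            # subtract one year if this year's birthday hasn't happened yet
            before_birthday = (today.month, today.day) < (birthdate.month, birthdate.day)
            return today.year - birthdate.year - before_birthday

        print(age(date(1955, 6, 18)))  # works fine for dates before 1970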

    Read the article

  • High level audio crossfading library for python

    - by tcoopman
    I am looking for a high-level audio library for Python that supports crossfading (and that works on Linux). In fact, crossfading two songs and saving the result is about the only thing I need. I tried pyechonest but I find it really slow, and working with multiple songs at the same time is hard on memory too (I tried to crossfade about 10 songs into one, but I got out-of-memory errors and my script was using 1.4 GB of memory). So now I'm looking for something else that works with Python. I have no idea if anything like that exists; if not, are there good command-line tools for this? I could write a wrapper for such a tool.
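
    One candidate worth checking (an assumption on my part; the question doesn't name it) is pydub, which delegates decoding to ffmpeg/avconv and exposes crossfading directly on its AudioSegment type. A minimal sketch with placeholder filenames:

        from pydub import AudioSegment  # requires ffmpeg or avconv for mp3 I/O

        a = AudioSegment.from_file("song_a.mp3")
        b = AudioSegment.from_file("song_b.mp3")

        # 5000 ms crossfade: the tail of `a` is faded out over the head of `b`
        mixed = a.append(b, crossfade=5000)
        mixed.export("crossfaded.mp3", format="mp3")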

    Read the article
