Search Results

Search found 15449 results on 618 pages for 'python signal'.


  • HTML5 video element non-seekable when using Django development server

    - by Ory Band
    Hey everyone. I've got a Django app serving a webpage with an HTML5 video element. There's a weird "feature" that makes the video element non-seekable: video.seekable returns a TimeRanges object with length=0, whereas it should be length=1. This means I can't seek within the video, and JavaScript can't do anything about it either. The thing is, when I upload the problematic webpage statically (no Django, just plain HTML/JS/CSS) to my website for testing, it works fine: length=1. However, serving the same static page through my Django dev server still gives the problem. I am using Django's static serving for dev/debug purposes. Do you have any idea what is causing this, or how I can fix it? Thanks.

    Read the article

  • Related to list and file handling?

    - by kaushik
    I have a file whose lines each contain a list literal, such as: [1,'ab','fgf','ssd'] [2,'eb','ghf','hhsd'] [3,'ag','rtf','ssfdd'] I want to read the file line by line using f.readline() and assign each line to a list, so I can use list operations on it in the program. I tried: k=[ ] k=f.readline() print k[1] I expected this to show the second element of the list on the first line, but it only showed a single character and gave '1' as output. How do I get the expected output? Please suggest.
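
    A minimal sketch of one way to do this, using ast.literal_eval to turn each line back into a real list (the filename data.txt is a placeholder):

        import ast

        with open('data.txt') as f:
            for line in f:
                line = line.strip()
                if not line:
                    continue
                k = ast.literal_eval(line)   # e.g. [1, 'ab', 'fgf', 'ssd']
                print(k[1])                  # second element of the list on this line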

    Read the article

  • Converting Numpy Lstsq residual value to R^2

    - by whatnick
    I am performing a least squares regression as below (univariate). I would like to express the significance of the result in terms of R^2. Numpy returns an unscaled residual value; what would be a sensible way of normalizing it? field_clean, back_clean = rid_zeros(backscatter, field_data) num_vals = len(field_clean) x = field_clean[:, row:row+1] y = 10*log10(back_clean) A = hstack([x, ones((num_vals, 1))]) soln = lstsq(A, y) m, c = soln[0] residues = soln[1] print residues
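
    A hedged sketch of the usual normalization: R^2 = 1 - SS_res / SS_tot, where SS_res is the residual sum of squares that lstsq returns and SS_tot is the total sum of squares of y about its mean. Synthetic data stands in for the question's field/backscatter arrays; rcond=None just silences a newer NumPy warning.

        import numpy as np

        # synthetic stand-ins for the question's x and y
        x = np.linspace(0.0, 10.0, 50)
        y = 2.5 * x + 1.0 + np.random.normal(scale=0.5, size=x.shape)

        A = np.hstack([x[:, np.newaxis], np.ones((len(x), 1))])
        soln, residues, rank, sv = np.linalg.lstsq(A, y, rcond=None)

        # residues is the unscaled sum of squared residuals; normalise it against
        # the total sum of squares of y to get R^2
        ss_res = residues[0] if residues.size else np.sum((y - A.dot(soln)) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        r_squared = 1.0 - ss_res / ss_tot
        print(r_squared)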

    Read the article

  • How to quickly parse a list of strings

    - by math
    If I want to split a list of words separated by a delimiter character, I can use >>> 'abc,foo,bar'.split(',') ['abc', 'foo', 'bar'] But how can I easily and quickly do the same thing if I also want to handle quoted strings which can contain the delimiter character? In: 'abc,"a string, with a comma","another, one"' Out: ['abc', 'a string, with a comma', 'another, one'] Related question: How can I parse a comma-delimited string into a list (caveat)?
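
    The standard library csv module already handles quoted fields that contain the delimiter; a minimal sketch:

        import csv

        line = 'abc,"a string, with a comma","another, one"'
        [row] = csv.reader([line])
        print(row)   # ['abc', 'a string, with a comma', 'another, one']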

    Read the article

  • sip.conf configuration file - add new line to each record

    - by Flukey
    I have a SIP configuration file which looks like this: [1664] username=1664 mailbox=1664@8360 host=192.168.254.3 type=friend subscribemwi=no [1679] username=1679 mailbox=1679@8360 host=192.168.254.3 type=friend subscribemwi=no [1700] username=1700 mailbox=1700@8360 host=192.168.254.3 type=friend subscribemwi=no [1701] username=1701 mailbox=1701@8360 host=192.168.254.3 type=friend subscribemwi=no For each record I need to add another line (a vmexten line for each record), for example the above becomes: [1664] username=1664 mailbox=1664@8360 host=192.168.254.3 type=friend subscribemwi=no vmexten=1664 [1679] username=1679 mailbox=1679@8360 host=192.168.254.3 type=friend subscribemwi=no vmexten=1679 [1700] username=1700 mailbox=1700@8360 host=192.168.254.3 type=friend subscribemwi=no vmexten=1700 [1701] username=1701 mailbox=1701@8360 host=192.168.254.3 type=friend subscribemwi=no vmexten=1701 What would you say is the quickest way to do this? There are hundreds of records in the file, so modifying all of them by hand would take a long time. Would you use regex? Would you use sed? I'm interested to know how you would approach the problem. Thanks
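
    A sketch of one scripted approach (assuming every record starts with a bracketed extension and ends at a blank line or the next header): remember the current extension and write a vmexten line whenever the record ends. The filenames are placeholders.

        import re

        def flush(dst, current):
            if current is not None:
                dst.write('vmexten=%s\n' % current)

        with open('sip.conf') as src, open('sip.conf.new', 'w') as dst:
            current = None
            for line in src:
                header = re.match(r'\[(\w+)\]', line)
                if header or not line.strip():
                    flush(dst, current)                 # previous record ends here
                    current = header.group(1) if header else None
                dst.write(line)
            flush(dst, current)                         # don't forget the last record

    If the position of the new line inside a record does not matter to Asterisk, a one-line sed substitution on the bracketed headers would be even shorter.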

    Read the article

  • Get parent function

    - by Morgoth
    Is there a way to find what function called the current function? So for example: def first(): second() def second(): # print out here what function called this one Any ideas?
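
    A sketch using the standard inspect module; the caller's record is one frame up the stack, and index 3 of that record is the function name (this indexing also works on older Python versions):

        import inspect

        def second():
            caller_name = inspect.stack()[1][3]   # name of whoever called second()
            print(caller_name)

        def first():
            second()

        first()   # prints: first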

    Read the article

  • Running the same code for get(self) as post(self)

    - by Peter Farmer
    It's been mentioned in other answers that you can run the same code for both def get(self) and def post(self) of any given request handler. I was wondering what techniques people use; I was thinking of: class ListSubs(webapp.RequestHandler): def get(self): self._run() def post(self): self._run() def _run(self): self.response.out.write("This works nicely!")
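
    A slightly shorter variant (behaviour is the same) is to alias one handler method to the other inside the class body:

        from google.appengine.ext import webapp

        class ListSubs(webapp.RequestHandler):
            def get(self):
                self.response.out.write("This works nicely!")

            post = get   # POST requests reuse the GET handler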

    Read the article

  • Detecting and interacting with long running process

    - by jacquesb
    I want a script to start and interact with a long running process. The process is started first time the script is executed, after that the script can be executed repeatedly, but will detect that the process is already running. The script should be able to interact with the process. I would like this to work on Unix and Windows. I am unsure how I do this. Specifically how do I detect if the process is already running and open a pipe to it? Should I use sockets (e.g. registering the server process on a known port and then check if it responds) or should I use "named pipes"? Or is there some easier way?
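
    A hedged sketch of the socket approach (the port number and the server script name are placeholders): try to connect to a well-known localhost port; if the connection is refused, the process is not running yet, so start it and retry. Sockets keep this portable across Unix and Windows.

        import socket
        import subprocess
        import sys
        import time

        PORT = 47123                         # placeholder port the server listens on

        def connect_or_start(retries=10):
            started = False
            for _ in range(retries):
                s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                try:
                    s.connect(('127.0.0.1', PORT))
                    return s                 # server is up; interact over this socket
                except socket.error:
                    s.close()
                    if not started:          # start the (hypothetical) server once, then retry
                        subprocess.Popen([sys.executable, 'server.py'])
                        started = True
                    time.sleep(0.5)          # give it a moment to bind the port
            raise RuntimeError('could not reach or start the server')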

    Read the article

  • Best way to order by columns in relationships?

    - by Timmy
    I'm using SQLAlchemy, and I have a few polymorphic tables; I want to sort by a column in one of the relationships. I have tables a, b, c, d, with relationships a to b, b to c, and c to d. a to b is one-to-many; b to c and c to d are one-to-one, but polymorphic. Given an item in table a, I will have a list of b items, each with its related c and d (one-to-one). How do I use SQLAlchemy to sort them by a column in table d?
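
    A sketch of how the query might be phrased with explicit joins along the relationships; the mapped classes A, B, C, D, the relationship attributes, and D.some_column are hypothetical stand-ins for the real a/b/c/d mappings, and polymorphic targets may additionally need with_polymorphic:

        items = (session.query(A)
                 .join(A.bs)              # one-to-many: a -> b
                 .join(B.c)               # one-to-one:  b -> c
                 .join(C.d)               # one-to-one:  c -> d
                 .order_by(D.some_column)
                 .all())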

    Read the article

  • Debugging (displaying) SQL command sent to the db by SQLAlchemy

    - by morpheous
    I have an ORM class called Person, which wraps a person table. After setting up the connection to the db etc., I run the following statement: people = session.query(Person).all() The person table does not contain any data (as yet), so when I print the variable people, I get an empty list. I renamed the table referred to in my ORM class Person to people_foo (which does not exist). I then ran the script again. I was surprised that no exception was thrown when attempting to access a table that does not exist. I therefore have the following two questions: How may I set up SQLAlchemy so that it propagates db errors back to the script? How may I view (i.e. print) the SQL that is being sent to the db engine? If it helps, I am using PostgreSQL as the db.
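
    For the second question at least, SQLAlchemy can log every statement it sends when the engine's echo flag is turned on (or via the sqlalchemy.engine logger); a minimal sketch with a placeholder connection URL:

        from sqlalchemy import create_engine

        engine = create_engine('postgresql://user:password@localhost/mydb', echo=True)
        # every SQL statement issued through this engine is now printed to stdout

        # equivalent, via the standard logging module:
        import logging
        logging.basicConfig()
        logging.getLogger('sqlalchemy.engine').setLevel(logging.INFO)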

    Read the article

  • Dynamically customize Django admin columns?

    - by tomjerry
    Is it possible to let users dynamically choose or change the columns displayed in an object list in the Django administration? This can surely be implemented "from scratch" by modifying the 'change_list.html' template, but I was wondering if somebody has already had the same problem and/or if any Django plugin can do that. Thanks in advance.
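
    One hedged approach in more recent Django versions is to override ModelAdmin.get_list_display(), which receives the request and can therefore vary the columns per user; the model name and the session key below are hypothetical:

        from django.contrib import admin
        from myapp.models import Person      # hypothetical model

        class PersonAdmin(admin.ModelAdmin):
            list_display = ('name', 'email')  # default columns

            def get_list_display(self, request):
                # columns chosen elsewhere (e.g. a small preferences view) and kept in the session
                return request.session.get('person_columns', self.list_display)

        admin.site.register(Person, PersonAdmin)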

    Read the article

  • Pickled my dictionary from ZODB but got a smaller one?

    - by Someone Someoneelse
    I use ZODB and I want to copy my 'database_1.fs' file to another 'database_2.fs', so I opened the root dictionary of 'database_1.fs' and pickle.dump'ed it to a file. Then I pickle.load'ed it into a dictionary variable, and finally I updated the root dictionary of the other 'database_2.fs' with that variable. It works, but I wonder why the size of 'database_1.fs' is not equal to the size of 'database_2.fs', even though they are copies of each other. def openstorage(filestorage): # opens the database data={} data['file']=filestorage data['db']=DB(data['file']) data['conn']=data['db'].open() data['root']=data['conn'].root() return data def getroot(dicty): return dicty['root'] def closestorage(dicty): # close the database after saving transaction.commit() dicty['file'].close() dicty['db'].close() dicty['conn'].close() transaction.get().abort() Then this is what I do: import pickle loc1='G:\\database_1.fs' op1=openstorage(loc1) root1=getroot(op1) loc2='G:\\database_2.fs' op2=openstorage(loc2) root2=getroot(op2) >>> len(root1) 215 >>> len(root2) 0 pickle.dump(root1, open("save.txt", "wb")) item=pickle.load(open("save.txt", "rb")) # now item is a dictionary root2.update(item) closestorage(op1) closestorage(op2) # after I open both databases I get the same keys in both, but database_2.fs is smaller than database_1.fs in size >>> len(root2)==len(root1)==215 # they have the same keys True Note: (1) there are persistent dictionaries and lists in the original database_1.fs (2) both of them have the same length and the same indexes.
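
    A likely explanation (hedged): a FileStorage .fs file is append-only and keeps every old revision of every object until it is packed, so database_1.fs still carries its whole history while the freshly written database_2.fs only holds the current state. Packing the original should bring the sizes much closer:

        # op1['db'] is the ZODB.DB instance opened above
        op1['db'].pack()    # discard old object revisions; the .fs file shrinks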

    Read the article

  • Prevent web2py from caching?

    - by Joe
    Hi! I'm working with web2py, and for some reason web2py seems to fail to notice when code has changed in certain cases. I can't really narrow it down, but from time to time changes in the code are not reflected; web2py obviously has the old version cached somewhere. The only thing that helps is quitting web2py and restarting it (I'm using the internal server). Any hints? Thank you!

    Read the article

  • PRAW (Reddit API): How to retrieve replies to a comment past 10 levels deep

    - by jpreed00
    Ok, so I've written some code that, for all intents and purposes, should work: def checkComments(comments): for comment in comments: print comment.body checkComments(comment.replies) def processSub(sub): sub.replace_more_comments(limit=None, threshold=0) checkComments(sub.comments) #login and subreddit init stuff here subs = mysubreddit.get_hot(limit=50) for sub in subs: processSub(sub) However, given a submission that has 50 nested replies like so: root comment -> 1st reply -> 2nd reply -> 3rd reply ... -> 50th reply The above code only prints: root comment 1st reply 2nd reply 3rd reply 4th reply 5th reply 6th reply 7th reply 8th reply 9th reply Any idea how I can get the remaining 41 levels of replies? Or is this a praw limitation?

    Read the article

  • What do I do with a Concrete Syntax Tree?

    - by Cap
    I'm using pyPEG to create a parse tree for a simple grammar. The tree is represented using lists and tuples. Here's an example: [('command', [('directives', [('directive', [('name', 'retrieve')]), ('directive', [('name', 'commit')])]), ('filename', [('name', 'f30502')])])] My question is what do I do with it at this point? I know a lot depends on what I am trying to do, but I haven't been able to find much about consuming/using parse trees, only creating them. Does anyone have any pointers to references I might use? Thanks for your help.
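
    A sketch of one common pattern: walk the (name, value) pairs recursively, recursing into list values and collecting the leaves; dispatching on the node name at each step is the usual next refinement.

        tree = [('command',
                 [('directives',
                   [('directive', [('name', 'retrieve')]),
                    ('directive', [('name', 'commit')])]),
                  ('filename', [('name', 'f30502')])])]

        def walk(node, out):
            name, value = node
            if isinstance(value, list):          # interior node: recurse into children
                for child in value:
                    walk(child, out)
            else:                                # leaf such as ('name', 'retrieve')
                out.setdefault(name, []).append(value)

        collected = {}
        for node in tree:
            walk(node, collected)
        print(collected)    # {'name': ['retrieve', 'commit', 'f30502']}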

    Read the article

  • Element-wise lookup from one ndarray into another ndarray of a different shape

    - by fahhean
    Hi, I am new to numpy. Is there a way to do a lookup between two ndarrays of different shapes? For example, I have the two ndarrays below: X = array([[0, 3, 6], [3, 3, 3], [6, 0, 3]]) Y = array([[0, 100], [3, 500], [6, 800]]) and I would like to look up each element of X in the first column of Y and return the corresponding value from the second column: Z = array([[100, 500, 800], [500, 500, 500], [800, 100, 500]]) thanks, fahhean
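
    One hedged way to do the lookup: treat the first column of Y as sorted keys, map every element of X to its row with numpy.searchsorted, and take the second column (this assumes Y[:, 0] is sorted and that every value in X appears there):

        import numpy as np

        X = np.array([[0, 3, 6],
                      [3, 3, 3],
                      [6, 0, 3]])
        Y = np.array([[0, 100],
                      [3, 500],
                      [6, 800]])

        idx = np.searchsorted(Y[:, 0], X)   # row of Y for each element of X
        Z = Y[idx, 1]
        print(Z)
        # [[100 500 800]
        #  [500 500 500]
        #  [800 100 500]]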

    Read the article

  • Get node name with minidom

    - by Alex
    Is it possible to get the name of a node using minidom? For example, I have a node: <heading><![CDATA[5 year]]></heading> What I'm trying to do is store the value 'heading' so that I can use it as a key in a dictionary; the closest I can get is something like [<DOM Element: heading at 0x11e6d28>]. I'm sure I'm overlooking something very simple here, thanks!
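
    The element object already carries its name as tagName (nodeName works too); a minimal sketch:

        from xml.dom import minidom

        doc = minidom.parseString('<heading><![CDATA[5 year]]></heading>')
        node = doc.getElementsByTagName('heading')[0]
        print(node.tagName)            # 'heading', usable as a dictionary key
        print(node.firstChild.data)    # '5 year'  (the CDATA text)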

    Read the article

  • Can pydoc/help() hide the documentation for inherited class methods and attributes?

    - by EOL
    When declaring a class that inherits from a specific class: class C(dict): added_attribute = 0 the documentation for class C lists all the methods of dict (either through help(C) or pydoc). Is there a way to hide the inherited methods from the automatically generated documentation (the documentation string can refer to the base class for non-overridden methods), or is it impossible? This would be useful: pydoc lists the functions defined in a module after its classes. Thus, when the classes have very long documentation, a lot of less-than-useful information is printed before the new functions provided by the module are presented, which makes the documentation harder to use (you have to skip all the documentation for the inherited methods until you reach something specific to the module being documented).

    Read the article

  • Twitter API post rate limit

    - by Xavier
    Does anyone know Twitter's rate limit on posting? Looking at their web page, they claim not to have one, but I get an exception thrown if my program posts too fast... Any help is appreciated.

    Read the article

  • if/elif chain making code look ugly, any cleaner solution?

    - by Vishal
    I have around 20 functions (is_func1, is_func2, is_func3, ...) returning booleans. I assume exactly one of them returns True, and I want that one! I am doing: if is_func1(param1, param2): # I pass 1 to following abc(1) some_list.append(1) elif is_func2(param1, param2): # I pass 2 to following abc(2) some_list.append(2) ... elif is_func20(param1, param2): ... Please note: param1 and param2 are different for each function; abc and some_list take parameters depending on the function. The code looks big and there is repetition in the calls to abc and some_list; I could pull this logic into a function, but is there any other cleaner solution? I can think of putting the functions in a data structure and looping over them to call each one.
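
    A sketch of the table-driven idea hinted at in the question: pair each predicate with its own argument tuple and the value to record, then stop at the first match, mirroring the if/elif chain. The lambdas and numbers below are placeholders for the real is_funcN functions and their parameters.

        def abc(n):                          # stand-in for the real abc()
            print('abc called with', n)

        some_list = []

        # each entry: (predicate, its argument tuple, value to record on a match)
        checks = [
            (lambda a, b: a > b,  (5, 3), 1),
            (lambda a, b: a == b, (2, 2), 2),
            (lambda a, b: a < b,  (1, 9), 3),
            # ... one entry per is_funcN, with that function's own parameters
        ]

        for predicate, args, value in checks:
            if predicate(*args):             # first match wins, like the if/elif chain
                abc(value)
                some_list.append(value)
                break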

    Read the article

  • SQL select from a large number of IDs

    - by Claudiu
    I have a table, Foo. I run a query on Foo to get the ids from a subset of Foo. I then want to run a more complicated set of queries, but only on those IDs. Is there an efficient way to do this? The best I can think of is creating a query such as: SELECT ... --complicated stuff WHERE ... --more stuff AND id IN (1, 2, 3, 9, 413, 4324, ..., 939393) That is, I construct a huge "IN" clause. Is this efficient? Is there a more efficient way of doing this, or is the only way to JOIN with the initial query that gets the IDs? If it helps, I'm using SQLObject to connect to a PostgreSQL database, and I have access to the cursor that executed the query to get all the IDs.
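
    If the IDs already come from a query, folding that query into the bigger one (a join or an IN (subquery)) keeps everything server-side. When the IDs really have to travel from Python, a hedged sketch (assuming the underlying driver is psycopg2, which adapts a Python list to a PostgreSQL array) avoids building a giant literal by hand; the table and column names are placeholders and cursor stands in for the cursor mentioned in the question:

        ids = [1, 2, 3, 9, 413, 4324, 939393]

        # psycopg2 turns the list into a PostgreSQL array, so ANY(%s) matches any listed id
        cursor.execute(
            "SELECT f.* FROM foo f WHERE f.id = ANY(%s)",
            (ids,),
        )
        rows = cursor.fetchall()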

    Read the article
