Search Results

Search found 35007 results on 1401 pages for 'google font api'.

Page 493/1401

  • App Engine JPA date query

    - by waney
    Hi there. Assume I have an object which represents a TASK. A Task has a due date. How do I create a query to get all tasks which are due today? Some working code like "select t from Task t where dueDate = :today" would be useful. Thank you in advance.
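
    A minimal sketch of one way to express "due today" in JPA on the App Engine datastore, assuming dueDate is persisted as a java.util.Date (the entity and field names simply follow the question; the class and method names are illustrative). Because the stored value is a full timestamp, "today" is written as a range filter from midnight to midnight:

        import java.util.Calendar;
        import java.util.Date;
        import java.util.List;
        import javax.persistence.EntityManager;
        import javax.persistence.Query;

        public class TaskQueries {
            // Returns all tasks whose dueDate falls within the current day.
            @SuppressWarnings("unchecked")
            public static List<Task> findTasksDueToday(EntityManager em) {
                Calendar cal = Calendar.getInstance();
                cal.set(Calendar.HOUR_OF_DAY, 0);
                cal.set(Calendar.MINUTE, 0);
                cal.set(Calendar.SECOND, 0);
                cal.set(Calendar.MILLISECOND, 0);
                Date startOfToday = cal.getTime();
                cal.add(Calendar.DAY_OF_MONTH, 1);
                Date startOfTomorrow = cal.getTime();

                Query q = em.createQuery(
                    "select t from Task t where t.dueDate >= :start and t.dueDate < :end");
                q.setParameter("start", startOfToday);
                q.setParameter("end", startOfTomorrow);
                return q.getResultList();
            }
        }

    Both inequality filters are on the same property (dueDate), which keeps the query within the datastore's single-inequality-property restriction.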

    Read the article

  • How to upload images to App Engine from GWT

    - by user356083
    Related question: I am having problems similar to the ones described there. My upload servlet returns a redirect. Specifically, I am not sure what FormPanel.SubmitCompleteEvent.getResults() returns. Sometimes I get the HTML of an img tag: <img style="cursor: -moz-zoom-in;" alt="http://<myapp>.appspot.com/servePic?blob-key=abcdef" src="http://<myapp>.appspot.com/servePic?blob-abcdef" height="1" width="1"> Sometimes I get the image data as raw bytes. I get the first in development and the second in production, and I don't know what the behavior depends on. Does anyone know anything about this?
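
    For reference, getResults() hands back the text of whatever document the hidden iframe contains after the submit, so its value depends entirely on what the server (or the redirect target) writes. A minimal client-side sketch of wiring the handler, with illustrative names (the upload URL and the "pic" field name are assumptions):

        import com.google.gwt.user.client.Window;
        import com.google.gwt.user.client.ui.FileUpload;
        import com.google.gwt.user.client.ui.FormPanel;

        public class UploadFormFactory {
            // Builds a multipart upload form and shows whatever the server wrote back.
            public static FormPanel createUploadForm(String uploadUrl) {
                FormPanel form = new FormPanel();
                form.setEncoding(FormPanel.ENCODING_MULTIPART);
                form.setMethod(FormPanel.METHOD_POST);
                form.setAction(uploadUrl);          // e.g. a Blobstore upload URL (assumption)

                FileUpload upload = new FileUpload();
                upload.setName("pic");              // the field name is illustrative
                form.setWidget(upload);

                form.addSubmitCompleteHandler(new FormPanel.SubmitCompleteHandler() {
                    public void onSubmitComplete(FormPanel.SubmitCompleteEvent event) {
                        // The body of the document the submit landed on; if the
                        // servlet redirected, this is the redirect target's content.
                        Window.alert("Upload finished: " + event.getResults());
                    }
                });
                return form;
            }
        }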

    Read the article

  • Session expiry times?

    - by user246114
    Hi, I've enabled sessions on my app in appengine-web.xml:

        <sessions-enabled>true</sessions-enabled>

    They seem to work when I load different pages under my domain. If I close the browser, however, it looks like the session is terminated: restarting the browser shows the last session is no longer available. That could be fine, I'm just wondering if this is documented anywhere, so I can rely on this behavior. I tried the following just to test whether we can tweak it, in web.xml:

        <session-config>
            <session-timeout>10</session-timeout>
        </session-config>

    and also in my servlet:

        getThreadLocalRequest().getSession().setMaxInactiveInterval(60 * 5);

    but the behavior is the same: session data is no longer available after a browser restart. I looked at the stats for my project and I see data being used for something like "_ah_SESSION" objects. Are those the sessions from above? If so, shouldn't they be cleaned up since they're no longer valid? (Hopefully GAE takes care of that automatically?) Thanks

    Read the article

  • How do you do bulk index lookups efficiently?

    - by Liron Shapira
    I have these entity kinds: Molecule, Atom, and MoleculeAtom. Given a list(molecule_ids) whose length is in the hundreds, I need to get a dict of the form {molecule_id: list(atom_ids)}. Likewise, given a list(atom_ids) whose length is in the hundreds, I need to get a dict of the form {atom_id: list(molecule_ids)}. Both of these bulk lookups need to happen really fast. Right now I'm doing something like:

        atom_ids_by_molecule_id = {}
        for molecule_id in molecule_ids:
            moleculeatoms = MoleculeAtom.all().filter('molecule =', db.Key.from_path('molecule', molecule_id)).fetch(1000)
            atom_ids_by_molecule_id[molecule_id] = [
                MoleculeAtom.atom.get_value_for_datastore(ma).id() for ma in moleculeatoms
            ]

    Like I said, len(molecule_ids) is in the hundreds. I need to do this kind of bulk index lookup on almost every single request, and I need it to be FAST, and right now it's too slow. Ideas: Will using a Molecule.atoms ListProperty do what I need? Consider that I am storing additional data on the MoleculeAtom node, and remember it's equally important for me to do the lookup in the molecule-atom and atom-molecule directions. Caching? I tried memcaching lists of atom IDs keyed by molecule ID, but I have tons of atoms and molecules, and the cache can't fit them. How about denormalizing the data by creating a new entity kind whose key name is a molecule ID and whose value is a list of atom IDs? The idea is, calling db.get on 500 keys is probably faster than looping through 500 fetches with filters, right?

    Read the article

  • Python App Engine form-posted UTF-8 file issue

    - by khany
    Hi, I am trying to form-post a SQL file that consists of many INSERTs, e.g.:

        INSERT INTO `TABLE` VALUES ('abcdé', 2759);

    Then I use re.search to parse it and extract the fields to put into my own datastore. The problem is that, although the file contains accented characters (note the é), once uploaded it loses them and either errors or stores a bytestring representation of them. Here's what I am currently using (and I have tried loads of alternatives):

        form = cgi.FieldStorage()
        uFile = form['sql']
        uSql = uFile.file.read()
        lineX = uSql.split("\n")  # to get each line

    and so on. Has anyone got a robust way of making this work? Remember I am on App Engine, so access to some libraries is restricted/forbidden.

    Read the article

  • How to add code into an iframe

    - by Athul
    Hello, how do I add some HTML/JavaScript code to an iframe embedded in our site? For example, I want to add AdSense to an iframe sourced from another domain (not a domain I have access to). So how can I make the iframe in my site load with my JavaScript?

    Read the article

  • What's wrong with this task queue setup?

    - by Peter Farmer
    I've set up this task queue implementation on a site I host for a customer. It has a cron job which runs each morning at 2am, "/admin/tasks/queue"; this queues up emails to be sent out via "/admin/tasks/email", and uses cursors so as to do the queuing in small chunks. For some reason last night /admin/tasks/queue kept getting run by this code and so sent out my whole quota of emails :/. Have I done something wrong with this code?

        class QueueUpEmail(webapp.RequestHandler):
            def post(self):
                subscribers = Subscriber.all()
                subscribers.filter("verified =", True)
                last_cursor = memcache.get('daily_email_cursor')
                if last_cursor:
                    subscribers.with_cursor(last_cursor)
                subs = subscribers.fetch(10)
                logging.debug("POST - subs count = %i" % len(subs))
                if len(subs) < 10:
                    logging.debug("POST - Less than 10 subscribers in subs")
                    # Subscribers left is less than 10, don't reschedule the task
                    for sub in subs:
                        task = taskqueue.Task(url='/admin/tasks/email',
                                              params={'email': sub.emailaddress, 'day': sub.day_no})
                        task.add("email")
                    memcache.delete('daily_email_cursor')
                else:
                    logging.debug("POST - Greater than 10 subscribers left in subs - reschedule")
                    # Subscribers is 10 or greater, reschedule
                    for sub in subs:
                        task = taskqueue.Task(url='/admin/tasks/email',
                                              params={'email': sub.emailaddress, 'day': sub.day_no})
                        task.add("email")
                    cursor = subscribers.cursor()
                    memcache.set('daily_email_cursor', cursor)
                    task = taskqueue.Task(url="/admin/tasks/queue", params={})
                    task.add("queueup")

    Read the article

  • Will Nginx as a reverse proxy for Apache help with dynamic content only?

    - by Saif Bechan
    I am planning to move all my static content to a CDN, so on my server I will only have dynamic content left. I now have Nginx set up as a reverse proxy to Apache. The static requests that came in were delivered directly by Nginx without having to go to Apache. In that setup Nginx handled a large portion of the requests and I could clearly see the need for it. Now that I have moved all the static content to another domain, is there still a need to have Nginx in front of Apache? All the requests are now dynamic by default and all go to Apache. Are there any other benefits of having Nginx and Apache running for dynamic content only? My dynamic content is PHP/MySQL. Edit: To be clear, I currently have Nginx as a reverse proxy. It delivers static and dynamic content. But I am moving my static files to a CDN. Do I then still need Nginx on my domain?

    Read the article

  • Browser: Cookie lost on refresh

    - by Nirmal
    I am experiencing strange behaviour of my application in the Chrome browser (no problem with other browsers). When I refresh a page, the cookie is usually sent properly, but intermittently the browser doesn't seem to pass the cookie on some refreshes. This is how I set my cookie:

        $identifier = /* some weird string */;
        $key = md5(uniqid(rand(), true));
        $timeout = number_format(time(), 0, '.', '') + 43200;
        setcookie('fboxauth', $identifier . ":" . $key, $timeout, "/", "fbox.mysite.com", 0);

    This is what I am using for the page headers:

        header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
        header("Cache-Control: no-cache, must-revalidate"); // HTTP/1.1
        header("Expires: Thu, 25 Nov 1982 08:24:00 GMT");   // Date in the past

    Do you see any issue here that might affect the cookie handling? Thank you for any suggestion. EDIT-01: It seems that the cookie is not being sent with some requests. This happens intermittently and I am now seeing this behaviour in ALL browsers. Has anyone come across such a situation? Is there any situation where a cookie will not be sent with the request? Thanks again for any guidance.

    Read the article

  • What causes this retainAll exception?

    - by Joren
        java.lang.UnsupportedOperationException: This operation is not supported on Query Results
            at org.datanucleus.store.query.AbstractQueryResult.contains(AbstractQueryResult.java:250)
            at java.util.AbstractCollection.retainAll(AbstractCollection.java:369)
            at namespace.MyServlet.doGet(MyServlet.java:101)

    I'm attempting to take one list I retrieved from a datastore query and keep only the results which are also in a list I retrieved from a list of keys. Both my lists are populated as expected, but I can't seem to use retainAll on either one of them.

        List<Data> listOne = new ArrayList(query.execute(theQuery));
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        List<Data> listTwo = new ArrayList(ds.get(keys).values());
        listOne.retainAll(listTwo);
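
    The stack trace shows the unsupported operation is contains() on the lazy DataNucleus query-result collection, which retainAll relies on internally. A minimal sketch of one common workaround under that reading: copy the results into plain in-memory collections and intersect on an identifier instead of calling retainAll (getId() is a hypothetical accessor on Data; substitute whatever uniquely identifies an entity):

        import java.util.ArrayList;
        import java.util.HashSet;
        import java.util.List;
        import java.util.Set;

        public class ListIntersection {
            // Keeps only the elements of listOne whose id also appears in listTwo,
            // without calling retainAll on a lazy query-result collection.
            public static List<Data> intersectById(List<Data> listOne, List<Data> listTwo) {
                Set<Long> idsInListTwo = new HashSet<Long>();
                for (Data d : listTwo) {
                    idsInListTwo.add(d.getId());   // getId() is a hypothetical accessor
                }
                List<Data> intersection = new ArrayList<Data>();
                for (Data d : listOne) {
                    if (idsInListTwo.contains(d.getId())) {
                        intersection.add(d);
                    }
                }
                return intersection;
            }
        }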

    Read the article

  • Optimizing tasks to reduce CPU in a trading application

    - by Joel
    Hello, I have designed a trading application that handles customers' stock investment portfolios. I am using two datastore kinds: Stocks, which contains a unique stock name and its daily percent change, and UserTransactions, which contains information about a specific purchase of a stock made by a user: the value of the purchase along with a reference to the Stock purchased. The db.Model classes:

        class Stocks(db.Model):
            stockname = db.StringProperty(multiline=True)
            dailyPercentChange = db.FloatProperty(default=1.0)

        class UserTransactions(db.Model):
            buyer = db.UserProperty()
            value = db.FloatProperty()
            stockref = db.ReferenceProperty(Stocks)

    Once an hour I need to update the database: update the daily percent change in Stocks and then update the value of all entities in UserTransactions that refer to that stock. The following module (Stocks.py) iterates over all the stocks, updates the dailyPercentChange property, and invokes a task to go over all UserTransactions entities which refer to the stock and update their value:

        # Iterate over all stocks in datastore
        for stock in Stocks.all():
            # update daily percent change in datastore
            db.run_in_transaction(updateStockTxn, stock.key())
            # create a task to update all user transactions entities referring to this stock
            taskqueue.add(url='/task', params={'stock_key': str(stock.key()),
                                               'value': self.request.get('some_val_for_stock')})

        def updateStockTxn(stock_key):
            # fetch the stock again - necessary to avoid concurrency updates
            stock = db.get(stock_key)
            stock.dailyPercentChange = data.get('some_val_for_stock')  # I get this value from outside
            # ... some more calculations here ...
            stock.put()

    Task.py (/task):

        # Amount of transactions per task
        amountPerCall = 10
        stock = db.get(self.request.get("stock_key"))
        # Get all user transactions which point to the current stock
        user_transaction_query = stock.usertransactions_set
        cursor = self.request.get("cursor")
        if cursor:
            user_transaction_query.with_cursor(cursor)
        # Spawn another task if more than 10 transactions are in datastore
        transactions = user_transaction_query.fetch(amountPerCall)
        if len(transactions) == amountPerCall:
            taskqueue.add(url='/task', params={'stock_key': str(stock.key()),
                                               'value': self.request.get('some_val_for_stock'),
                                               'cursor': user_transaction_query.cursor()})
        # Iterate over all transactions pointing to the stock and update their value
        for transaction in transactions:
            db.run_in_transaction(updateUserTransactionTxn, transaction.key())

        def updateUserTransactionTxn(transaction_key):
            # fetch the transaction again - necessary to avoid concurrency updates
            transaction = db.get(transaction_key)
            transaction.value = transaction.value * self.request.get('some_val_for_stock')
            db.put(transaction)

    The problem: currently the system works great, but it is not scaling well. I have around 100 Stocks with 300 UserTransactions, and I run the update every hour. In the dashboard, I see that Task.py takes around 65% of the CPU (Stocks.py takes around 20%-30%) and I am using almost all of the 6.5 free CPU hours given to me by App Engine. I have no problem enabling billing and paying for additional CPU, but the real issue is the scaling of the system: using 6.5 CPU hours for 100 stocks is very poor. I was wondering, given the requirements of the system as mentioned above, whether there is a better and more efficient implementation (or just a small change that can help with the current implementation) than the one presented here. Thanks!! Joel

    Read the article

  • Valid Dates in AppEngine forms (Beginner)

    - by codingJoe
    In App Engine, I have a form that prompts a user for a date. The problem is that when the user hits enter there is an error: "Enter a valid date". How do I make my form accept (for example) %d-%b-%Y as the date format? Is there a more elegant way to accomplish this?

        # Model and Forms
        class Task(db.Model):
            name = db.StringProperty()
            due = db.DateProperty()

        class TaskForm(djangoforms.ModelForm):
            class Meta:
                model = Task

        # my get function has the following.
        # using "now" for example. Could just as well be next Friday.
        tmStart = datetime.now()
        form = TaskForm(initial={'due': tmStart.strftime("%d-%b-%Y")})
        template_values = {'form': form}

    Read the article

  • GAE datastore querying integer fields

    - by ParanoidAndroid
    I notice strange behavior when querying the GAE datastore. Under certain circumstances a Filter does not work for integer fields. The following Java code reproduces the problem:

        log.info("start experiment");
        DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
        int val = 777;

        // create and store the first entity.
        Entity testEntity1 = new Entity(KeyFactory.createKey("Test", "entity1"));
        Object value = new Integer(val);
        testEntity1.setProperty("field", value);
        datastore.put(testEntity1);

        // create the second entity by using BeanUtils.
        Test test2 = new Test(); // just a regular bean with an int field
        test2.setField(val);
        Entity testEntity2 = new Entity(KeyFactory.createKey("Test", "entity2"));
        Map<String, Object> description = BeanUtilsBean.getInstance().describe(test2);
        for (Entry<String, Object> entry : description.entrySet()) {
            testEntity2.setProperty(entry.getKey(), entry.getValue());
        }
        datastore.put(testEntity2);

        // now try to retrieve the entities from the database...
        Filter equalFilter = new FilterPredicate("field", FilterOperator.EQUAL, val);
        Query q = new Query("Test").setFilter(equalFilter);
        Iterator<Entity> iter = datastore.prepare(q).asIterator();
        while (iter.hasNext()) {
            log.info("found entity: " + iter.next().getKey());
        }
        log.info("experiment finished");

    The log looks like this:

        INFO: start experiment
        INFO: found entity: Test("entity1")
        INFO: experiment finished

    For some reason it only finds the first entity, even though both entities are actually stored in the datastore and both 'field' values are 777 (I see it in the Datastore Viewer)! Why does it matter how the entity is created? I would like to use BeanUtils, because it is convenient. The same problem occurs on the local devserver and when deployed to GAE.
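
    One detail worth checking here, offered as a hypothesis rather than a confirmed diagnosis: BeanUtilsBean.describe() converts every property value to its String representation, so the second entity may end up with "field" stored as the string "777", which an integer equality filter will never match. PropertyUtils.describe() keeps the original value types; a minimal sketch of copying bean properties that way (the helper class name is illustrative):

        import java.lang.reflect.InvocationTargetException;
        import java.util.Map;
        import org.apache.commons.beanutils.PropertyUtils;
        import com.google.appengine.api.datastore.Entity;

        public class EntityMapper {
            // Copies bean properties onto a datastore entity without converting
            // the values to Strings, so an int field stays an Integer.
            public static void copyBeanProperties(Object bean, Entity entity)
                    throws IllegalAccessException, InvocationTargetException, NoSuchMethodException {
                @SuppressWarnings("unchecked")
                Map<String, Object> description = PropertyUtils.describe(bean);
                for (Map.Entry<String, Object> entry : description.entrySet()) {
                    if (!"class".equals(entry.getKey())) {  // skip the getClass() pseudo-property
                        entity.setProperty(entry.getKey(), entry.getValue());
                    }
                }
            }
        }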

    Read the article

  • Subqueries on Java GAE Datastore

    - by Dmitry
    I am trying to create a database of users with connections between users (a friends list). There are two main tables: UserEntity (main field id) and FriendEntity with the fields initiatorId (id of the user who initiated the friendship) and friendId (id of the user who has been invited). Now I am trying to fetch all friends of one particular user and have encountered some problems using subqueries in JDO. Logically the query should be something like this in SQL:

        SELECT * FROM UserEntity WHERE EXISTS
            (SELECT * FROM FriendEntity
             WHERE (initiatorId == UserEntity.id && friendId == userId)
                || (friendId == UserEntity.id && initiatorId == userId))

    or:

        SELECT * FROM UserEntity
        WHERE userId IN (SELECT * FROM FriendEntity WHERE initiatorId == UserEntity.id)
           OR userId IN (SELECT * FROM FriendEntity WHERE friendId == UserEntity.id)

    So to replicate the last query in JDOQL, I tried the following:

        Query friendQuery = pm.newQuery(FriendEntity.class);
        friendQuery.setFilter("initiatorId == uidParam");
        friendQuery.setResult("friendId");

        Query initiatorQuery = pm.newQuery(FriendEntity.class);
        initiatorQuery.setFilter("friendId == uidParam");
        initiatorQuery.setResult("initiatorId");

        Query query = pm.newQuery(UserEntity.class);
        query.setFilter("initiatorQuery.contains(id) || friendQuery.contains(id)");
        query.addSubquery(initiatorQuery, "List initiatorQuery", null, "String uidParam");
        query.addSubquery(friendQuery, "List friendQuery", null, "String uidParam");
        query.declareParameters("String uidParam");
        List<UserEntity> friends = (List<UserEntity>) query.execute(userId);

    As a result I get the following error: "Unsupported method while parsing expression." Could anyone help with this query, please?
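
    The App Engine datastore does not execute joins or correlated subqueries, so one common pattern (a sketch of an alternative, not necessarily what the poster settled on) is to run the two FriendEntity queries on their own, collect the ids, and then load the matching UserEntity objects. This assumes the ids are Strings, as in the question's parameters, and that UserEntity is keyed by that id:

        import java.util.ArrayList;
        import java.util.HashSet;
        import java.util.List;
        import java.util.Set;
        import javax.jdo.PersistenceManager;
        import javax.jdo.Query;

        public class FriendLookup {
            // Collects friend ids from both directions of the relationship,
            // then fetches each UserEntity individually.
            @SuppressWarnings("unchecked")
            public static List<UserEntity> findFriends(PersistenceManager pm, String userId) {
                Set<String> friendIds = new HashSet<String>();

                Query initiated = pm.newQuery(FriendEntity.class, "initiatorId == uidParam");
                initiated.declareParameters("String uidParam");
                initiated.setResult("friendId");
                friendIds.addAll((List<String>) initiated.execute(userId));

                Query received = pm.newQuery(FriendEntity.class, "friendId == uidParam");
                received.declareParameters("String uidParam");
                received.setResult("initiatorId");
                friendIds.addAll((List<String>) received.execute(userId));

                List<UserEntity> friends = new ArrayList<UserEntity>();
                for (String id : friendIds) {
                    friends.add(pm.getObjectById(UserEntity.class, id));
                }
                return friends;
            }
        }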

    Read the article

  • Delete files from blobstore using file serving URL

    - by Arturo
    In my app (GWT on GAE) we store in our database the serving URL of each file that is stored in the blobstore. When the user selects one of these files and clicks "delete", we need to delete the file from the blobstore. This is our code, but it is not deleting the file at all:

        public void remove(String fileURL) {
            BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
            String key = getBlobKeyFromURL(box.getImageURL());
            BlobKey blobKey = new BlobKey(key);
            blobstoreService.delete(blobKey);
        }

    where fileURL looks like this:

        http://lh6.ggpht.com/d5VC0ywISACeJRiC3zkzaZug-tPsaI_LGt93-e_ATGTCwnGLao4yTWjLVppQ

    and getBlobKeyFromURL() returns what comes after the last "/", in this example:

        d5VC0ywISACeJRiC3zkzaZug-tPsaI_LGt93-e_ATGTCwnGLao4yTWjLVppQ

    Could you please advise? Thanks
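
    One thing worth verifying (a hypothesis, not a confirmed diagnosis): the token at the end of an image serving URL is not necessarily the blob key string, so constructing a BlobKey from it may silently delete nothing. A pattern that sidesteps the URL parsing is to persist the blob key string itself at upload time and use that for deletion. A sketch, where the "image" form field name, the helper class, and where the values get persisted are assumptions:

        import java.util.Map;
        import javax.servlet.http.HttpServletRequest;
        import com.google.appengine.api.blobstore.BlobKey;
        import com.google.appengine.api.blobstore.BlobstoreService;
        import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
        import com.google.appengine.api.images.ImagesServiceFactory;

        public class BlobHelper {
            private static final BlobstoreService blobstore =
                    BlobstoreServiceFactory.getBlobstoreService();

            // In the upload handler: capture both the serving URL (for display)
            // and the blob key string (for deletion later).
            public static String[] describeUpload(HttpServletRequest request) {
                Map<String, BlobKey> uploads = blobstore.getUploadedBlobs(request);
                BlobKey blobKey = uploads.get("image");   // form field name is an assumption
                String servingUrl = ImagesServiceFactory.getImagesService().getServingUrl(blobKey);
                return new String[] { servingUrl, blobKey.getKeyString() };
            }

            // Later, when the user clicks "delete":
            public static void remove(String blobKeyString) {
                blobstore.delete(new BlobKey(blobKeyString));
            }
        }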

    Read the article

  • Why is Chrome miscalculating jQuery submenu dimensions?

    - by chunkymonkey
    I'm trying to implement this dropdown menu with flyouts: http://jsfiddle.net/chunkymonkey/fr6x4/ In Chrome, certain categories can be expanded to show their subcategories while others show nothing when opened up. For example, Alternative Rock can be expanded to show its multiple subcategories, but World Music, which has as many subcategories, shows no subcategories when expanded. (Screenshot: http://i.imgur.com/0WorR.jpg) I thought I had tracked this down to a problem with the way the dimensions of the dropdown elements are calculated in the original code. First change:

        - var newLeftVal = - ($('.fg-menu-current').parents('ul').size() - 1) * 180;
        + var newLeftVal = - ($('.fg-menu-current').parents('ul').size() - 1) * container.width();

    Second change, remove:

        var checkMenuHeight = function(el) {
            if (el.height() > options.maxHeight) { el.addClass('fg-menu-scroll') };
            el.css({ height: options.maxHeight });
        };

    and add:

        var checkMenuHeight = function(el) {
            var max_height = options.maxHeight - breadcrumb.getTotalHeight();
            if (el.height() > max_height) {
                el.addClass('fg-menu-scroll');
                el.height(max_height);
                topList.height(max_height);
            } else {
                if (topList.height() < el.height()) {
                    topList.height(el.height());
                }
            }
        };

    But it's still not working, and only in Chrome (version 8, Windows & Mac). I'm not sure why Chrome behaves differently.

    Read the article
