Search Results

Search found 6172 results on 247 pages for 'limit choices to'.

Page 117/247

  • web page zooming

    - by tibin mathew
    Hi friends, I am developing a web site using PHP. I have placed many ads on it, and I want code that zooms the web page in up to a limit and then back out to its normal size. How can I do this? Does anyone have an idea? Thanks

    Read the article

  • MySQL - How do I insert an additional where clause into this full-text search (updated)

    - by Steven
    I want to add a WHERE clause to a full-text search query (to limit results to the past 24 hours), but wherever I insert it I get "Low Level Error". Is it possible to add the clause, and if so, how? Here is the code without the WHERE clause:

        $query = "SELECT *, MATCH (story_title) AGAINST ('$query' IN BOOLEAN MODE) AS Relevance
                  FROM stories
                  WHERE MATCH (story_title) AGAINST ('+$query' IN BOOLEAN MODE)
                  HAVING Relevance > 0.2
                  ORDER BY Relevance DESC, story_time DESC";
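
    One way to combine the conditions, sketched below: the 24-hour filter is ANDed onto the existing WHERE rather than added as a second WHERE clause. This assumes story_time (which already appears in the ORDER BY) is a DATETIME-type column:

        SELECT *,
               MATCH (story_title) AGAINST ('search terms' IN BOOLEAN MODE) AS Relevance
        FROM stories
        WHERE MATCH (story_title) AGAINST ('+search terms' IN BOOLEAN MODE)
          AND story_time >= NOW() - INTERVAL 24 HOUR  -- limit to the past 24 hours
        HAVING Relevance > 0.2
        ORDER BY Relevance DESC, story_time DESC;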

    Read the article

  • How secure are .htaccess-protected pages

    - by Steven smethurst
    Are there any known flaws with .htaccess-protected pages? I know they are susceptible to brute-force attacks, since there is no limit on the number of times someone can attempt to log in. And if a user can upload and execute a file on the server, all bets are off... Any other .htaccess flaws?

    Read the article

  • Need MySQL query for finding lowest score per game player

    - by Chris Barnhill
    I have a game on Facebook called Rails Across Europe. I have a Best Scores page where I show the players with the best 20 scores, which in game terms means the lowest winning turn. The problem is that a small number of players play frequently, and their scores dominate the page. I'd like to open the scores page up to more players, so I thought I could display the single lowest winning turn for each player instead of displaying all of the lowest winning turns across all players. The query for this eludes me, so I hope one of you brilliant StackOverflow folks can help me with it. The relevant MySQL table schemas are included below.

    Here are the table relationships: player_stats contains statistics for either a game in progress or a completed game. If a game is in progress, winning_turn is zero (which means that games with a winning_turn of zero should not be included in the query). player_stats has a game_player table id reference. game_player contains data describing games currently in progress and has a player table id reference. player contains data describing a person who plays the game.

    Here's the query I'm currently using (built as a PHP string, with $limit appended):

        SELECT p.fb_user_id, ps.winning_turn, gp.difficulty_level,
               c.name AS city_name, g.name AS goods_name, d.cost
        FROM game_player AS gp, player AS p, player_stats AS ps,
             demand AS d, city AS c, goods AS g
        WHERE p.status = "ACTIVE"
          AND gp.player_id = p.id
          AND ps.game_player_id = gp.id
          AND d.id = ps.highest_demand_id
          AND c.id = d.city_id
          AND g.id = d.goods_id
          AND ps.winning_turn > 0
        ORDER BY ps.winning_turn ASC, d.cost DESC
        LIMIT $limit;

    Here are the relevant table schemas:

        -- Table structure for table `player_stats`
        CREATE TABLE IF NOT EXISTS `player_stats` (
          `id` int(11) NOT NULL auto_increment,
          `game_player_id` int(11) NOT NULL,
          `winning_turn` int(11) NOT NULL,
          `highest_demand_id` int(11) NOT NULL,
          PRIMARY KEY (`id`),
          KEY `game_player_id` (`game_player_id`,`highest_demand_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=3814;

        -- Table structure for table `game_player`
        CREATE TABLE IF NOT EXISTS `game_player` (
          `id` int(10) unsigned NOT NULL auto_increment,
          `game_id` int(10) unsigned NOT NULL,
          `player_id` int(10) unsigned NOT NULL,
          `player_number` int(11) NOT NULL,
          `funds` int(10) unsigned NOT NULL,
          `turn` int(10) unsigned NOT NULL,
          `difficulty_level` enum('STANDARD','ADVANCED','MASTER','ULTIMATE') NOT NULL,
          `date_last_used` datetime NOT NULL,
          PRIMARY KEY (`id`),
          KEY `game_id` (`game_id`,`player_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=3814;

        -- Table structure for table `player`
        CREATE TABLE IF NOT EXISTS `player` (
          `id` int(11) NOT NULL auto_increment,
          `fb_user_id` char(255) NOT NULL,
          `fb_proxied_email` text NOT NULL,
          `first_name` char(255) NOT NULL,
          `last_name` char(255) NOT NULL,
          `birthdate` date NOT NULL,
          `date_registered` datetime NOT NULL,
          `date_last_logged_in` datetime NOT NULL,
          `status` enum('ACTIVE','SUSPENDED','CLOSED') NOT NULL,
          PRIMARY KEY (`id`),
          KEY `fb_user_id` (`fb_user_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1646;
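
    A hedged sketch of one way to get each player's single best (lowest) winning turn: compute MIN(winning_turn) per player in a derived table, then join back to pick up the rest of the row; demand, city, and goods are joined exactly as in the query above. If a player has two games tied at the same lowest turn, this sketch will return a row per match, so ties may need an extra rule:

        SELECT p.fb_user_id, best.winning_turn, gp.difficulty_level,
               c.name AS city_name, g.name AS goods_name, d.cost
        FROM (
            -- lowest completed winning turn per player
            SELECT gp.player_id, MIN(ps.winning_turn) AS winning_turn
            FROM player_stats ps
            JOIN game_player gp ON gp.id = ps.game_player_id
            WHERE ps.winning_turn > 0
            GROUP BY gp.player_id
        ) AS best
        JOIN game_player gp ON gp.player_id = best.player_id
        JOIN player_stats ps ON ps.game_player_id = gp.id
                            AND ps.winning_turn = best.winning_turn
        JOIN player p ON p.id = best.player_id AND p.status = 'ACTIVE'
        JOIN demand d ON d.id = ps.highest_demand_id
        JOIN city c ON c.id = d.city_id
        JOIN goods g ON g.id = d.goods_id
        ORDER BY best.winning_turn ASC, d.cost DESC
        LIMIT 20;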

    Read the article

  • Is there a max size for POST parameter content?

    - by l3dx
    I'm troubleshooting a Java app where XML is sent between two systems using HTTP POST and a servlet. I suspect the problem is that the XML is growing way too big. Is it possible that this is the problem? Is there a limit? When it doesn't work, request.getParameter("message") on the consumer side returns null. Both apps are running on Tomcat.

    Read the article

  • SQL Server Maximum row size

    - by DannySmurf
    Came across this error today. Wondering if anyone can tell me what it means: "Cannot sort a row of size 9522, which is greater than the allowable maximum of 8094." Is that 8094 bytes? Characters? Fields? Is this a problem with joining multiple tables that together exceed some limit?
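
    For what it's worth, the limit in that message is in bytes: older SQL Server versions cannot sort a result row wider than 8094 bytes. A hedged sketch for measuring how wide your rows actually are, where the table and column names (t, col1...) are placeholders for whatever your query selects:

        -- sum the byte widths of the selected columns, widest rows first
        SELECT TOP 10
               ISNULL(DATALENGTH(col1), 0)
             + ISNULL(DATALENGTH(col2), 0)
             + ISNULL(DATALENGTH(col3), 0) AS row_bytes
        FROM t
        ORDER BY row_bytes DESC;

    If a join pulls several wide columns into one result row, the combined width can cross the limit even when each table is fine on its own.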

    Read the article

  • What does "cardinality of a relationship" mean in Core Data?

    - by dontWatchMyProfile
    From the docs: "If all of a managed object's relationship delete rules are Nullify, then for that object at least there is no additional work to do (you may have to consider other objects that were at the destination of the relationship—if the inverse relationship was either mandatory or had a lower limit on cardinality, then the destination object or objects might be in an invalid state)." Does someone have an example of this cardinality thing? What is it good for, and what is important to know about it? (It sounds very important...)

    Read the article

  • Calculating percent of votes inside a MySQL statement

    - by Beck
    Here is the query:

        UPDATE polls_options
        SET `votes` = `votes` + 1,
            `percent` = ROUND((`votes` + 1) / (SELECT voters FROM polls WHERE poll_id = ? LIMIT 1) * 100, 1)
        WHERE option_id = ? AND poll_id = ?

    I don't have table data yet to test it properly. :) And by the way, what column type should the percentage be stored in? Thanks for the help!
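
    Two hedged notes on this. First, MySQL evaluates SET assignments left to right, and a column assigned earlier in the same UPDATE is seen with its new value, so (`votes` + 1) inside the percent expression would count the vote twice; after the first assignment, `votes` already holds the incremented count:

        UPDATE polls_options
        SET `votes` = `votes` + 1,
            `percent` = ROUND(`votes` / (SELECT voters FROM polls WHERE poll_id = ? LIMIT 1) * 100, 1)
        WHERE option_id = ? AND poll_id = ?;

    Second, on the column type: a percentage rounded to one decimal place fits DECIMAL rather than an integer type, e.g. DECIMAL(4,1), which spans 0.0 through 100.0 exactly (the type choice is a suggestion, not something from the original post):

        ALTER TABLE polls_options
            MODIFY `percent` DECIMAL(4,1) NOT NULL DEFAULT 0.0;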

    Read the article

  • iTunes RSS Feed returning max. 100 items instead of 300

    - by TheEye
    I tried the iTunes RSS generator at http://itunes.apple.com/rss/generator/ to download the newest 300 games, which gave me the RSS feed URL http://itunes.apple.com/us/rss/newapplications/limit=300/genre=6014/xml. However, only 100 items are returned, sorted alphabetically, so the list stops at the letter E... Did Apple restrict the number of items one can get without updating their RSS Feed Generator, or am I missing something?

    Read the article

  • Reading a large text file into memory in C++

    - by NoneType
    Is there a way to read a large text file (~60 MB) into memory at once (like a compiler flag to increase the program's memory limit)? Currently, ifstream's open function segfaults while trying to read this file:

        ifstream fis;
        fis.open("my_large_file.txt"); // Segfaults here

    The file just consists of rows of the form number_1<tab>number_2, i.e., two numbers separated by a tab character.

    Read the article

  • Why is Dictionary.First() so slow?

    - by Rotsor
    Not a real question, because I already found out the answer, but it's still an interesting thing. I always thought that a hash table was the fastest associative container if you hash properly. However, the following code is terribly slow. It executes only about 1 million iterations and takes more than 2 minutes on a Core 2 CPU.

    The code does the following: it maintains a collection todo of items it needs to process. At each iteration it takes an item from this collection (doesn't matter which item), deletes it, processes it if it wasn't processed already (possibly adding more items to process), and repeats until there are no items to process. The culprit seems to be the Dictionary.Keys.First() operation. The question is: why is it slow?

        Stopwatch watch = new Stopwatch();
        watch.Start();
        HashSet<int> processed = new HashSet<int>();
        Dictionary<int, int> todo = new Dictionary<int, int>();
        todo.Add(1, 1);
        int iterations = 0;
        int limit = 500000;
        while (todo.Count > 0)
        {
            iterations++;
            var key = todo.Keys.First();
            var value = todo[key];
            todo.Remove(key);
            if (!processed.Contains(key))
            {
                processed.Add(key);
                // process item here
                if (key < limit)
                {
                    todo[key + 13] = value + 1;
                    todo[key + 7] = value + 1;
                } // doesn't matter much how
            }
        }
        Console.WriteLine("Iterations: {0}; Time: {1}.", iterations, watch.Elapsed);

    This results in: Iterations: 923007; Time: 00:02:09.8414388. Simply changing Dictionary to SortedDictionary yields: Iterations: 499976; Time: 00:00:00.4451514. That is 300 times faster while doing only half as many iterations. The same happens in Java, using HashMap instead of Dictionary and keySet().iterator().next() instead of Keys.First().

    Read the article

  • Textbox character countdown in ASP.NET

    - by Ugene
    I have many textboxes on a form, and all of them need a character limit with a character countdown ("50 characters left of 150"). What is the best way to achieve this, and can anyone please provide code to implement it? Many thanks

    Read the article

  • Efficiently fetching and storing tweets from a few hundred Twitter profiles?

    - by MSpreij
    The site I'm working on needs to fetch the tweets from 150-300 people, store them locally, and then list them on the front page. The profiles sit in groups. The pages will show:

    - the last 20 tweets (or 21-40, etc.) by date, group of profiles, single profile, search, or "subject" (which is sort of a different group.. I think..)
    - a live, context-aware tag cloud (based on the last 300 tweets of the current search, group of profiles, or single profile shown)
    - various statistics (group stuff, most active, etc.), depending on the type of page shown

    We're expecting a fair bit of traffic. The last, similar site peaked at nearly 40K visits per day and ran into trouble before I started caching pages as static files and disabling some features (some accidentally..). This was caused mostly by the fact that a page load would also fetch the last x tweets from the 3-6 profiles which had not been updated the longest. With this new site I can fortunately use cron to fetch tweets, so that helps. I'll also be denormalizing the DB a little so it needs fewer joins, optimizing it for faster selects instead of size.

    Now, the main question: how do I figure out which profiles to check for new tweets in an efficient manner? Some people tweet more often than others, and some tweet in bursts (this happens a lot). I want to keep the front page of the site as "current" as possible. If it comes to, say, 300 profiles, and I check 5 every minute, some tweets will only appear an hour after the fact. I can check more often (up to 20K) but want to optimize this as much as possible, both to stay within the rate limit and to not run out of resources on the local server (the other site hit MySQL's connection limit). A scheduling sketch follows below.

    Question 2: since cron only "runs" once a minute, I figure I have to check multiple profiles each minute: at least 5, possibly more. To spread the work over that minute I could sleep a few seconds between batches or even between single profiles. But if it then takes longer than 60 seconds altogether, the script will run into itself. Is this a problem? If so, how can I avoid that?

    Question 3: any other tips? Readmes? URLs?
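
    On the main question, one common approach is adaptive polling: store a per-profile check interval that shrinks when new tweets show up and grows when nothing has changed, then let each cron run pick the most overdue profiles. A hedged sketch; the profiles table and all of its columns here are hypothetical:

        -- each minute: pick the profiles whose next check is most overdue
        SELECT profile_id
        FROM profiles
        WHERE DATE_ADD(last_checked, INTERVAL check_interval SECOND) <= NOW()
        ORDER BY DATE_ADD(last_checked, INTERVAL check_interval SECOND) ASC
        LIMIT 5;

        -- after fetching a profile: halve the interval when new tweets arrived,
        -- grow it (with a cap) when nothing changed
        UPDATE profiles
        SET last_checked = NOW(),
            check_interval = IF(@new_tweets > 0,
                                GREATEST(check_interval DIV 2, 60),
                                LEAST(check_interval * 2, 3600))
        WHERE profile_id = @profile_id;

    Busy profiles then converge toward a 60-second floor while quiet ones drift toward an hourly ceiling, which keeps both the API usage and the front-page lag bounded.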

    Read the article

  • Unknown column '' in 'field list'

    - by Rixius
    I am running a SQL query in PHP that dies with the mysql_error() "Unknown column '' in 'field list'". The query:

        SELECT `standard` AS fee FROM `corporation_state_fee` WHERE `stateid` = '8' LIMIT 1

    When I run the query in phpMyAdmin, it returns the information without flagging the error.
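
    A hedged guess, since the query as shown is valid: MySQL puts the offending column name between the quotes in that message, and here it is empty, which usually means an empty identifier reached the server. That typically happens when a PHP variable interpolated into the field list is empty at runtime, so the SQL the server actually received may have looked like this sketch (the $column variable is hypothetical):

        -- "SELECT `$column` AS fee FROM ..." with an empty $column becomes:
        SELECT `` AS fee FROM `corporation_state_fee` WHERE `stateid` = '8' LIMIT 1;

    Echoing the final query string just before calling mysql_query() would confirm whether the PHP-built query differs from the one pasted into phpMyAdmin.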

    Read the article

  • Better way to do a SELECT with GROUP BY

    - by Luca Romagnoli
    Hi, I've written a query that works:

        SELECT `comments`.*
        FROM `comments`
        RIGHT JOIN (
            SELECT MAX(id) AS id, core_id, topic_id
            FROM comments
            GROUP BY core_id, topic_id
            ORDER BY id DESC
        ) comm ON comm.id = comments.id
        LIMIT 10

    I want to know whether it can be rewritten for better performance, and if so, how. Thanks
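
    A hedged rewrite to consider: the ORDER BY inside the derived table does nothing (the outer query determines the final order), an INNER JOIN states the intent more directly than RIGHT JOIN since every id from the derived table exists in comments, and LIMIT without an outer ORDER BY returns an arbitrary 10 rows, so the order should be explicit:

        SELECT c.*
        FROM comments c
        JOIN (
            SELECT MAX(id) AS id  -- newest comment per (core_id, topic_id)
            FROM comments
            GROUP BY core_id, topic_id
        ) latest ON latest.id = c.id
        ORDER BY c.id DESC
        LIMIT 10;

    If performance still matters after that, a composite index on (core_id, topic_id, id) would let the GROUP BY be resolved from the index alone.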

    Read the article

  • MySQL: auto-increment a column or just insert unique integers, what's the difference?

    - by David19801
    Hi, if I have a column that is the primary index and typed as INT, and I don't make it auto-increment but instead insert random unique integers into it, does that slow down future queries compared to auto-incrementing? Does it speed things up to run OPTIMIZE on a table whose primary and only index is that INT column? (Assume only two columns, and the second column is just some INT value.) My main worry is the upper limit on the auto-increment value, as there are lots of inserts and deletes in my table.
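
    On the upper-limit worry, a hedged note: deleted rows do not give their auto-increment values back, but moving to BIGINT UNSIGNED makes the ceiling a non-issue in practice. A minimal sketch, assuming a table named t with key column id (both names hypothetical):

        -- raise the ceiling from INT's ~2.1 billion to ~1.8e19
        ALTER TABLE t
            MODIFY id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT;

    As for speed: with InnoDB, sequential auto-increment keys append at the end of the clustered index, while random keys scatter inserts and cause page splits, which is one reason random unique integers can be the slower choice.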

    Read the article

  • Efficient SQL to count an occurrence in the latest X rows

    - by pulegium
    For example, I have:

        create table a (i int);

    Assume there are 10k rows. I want to count the 0's in the last 20 rows. Something like:

        select count(*) from (select i from a limit 20) where i = 0;

    Is it possible to make this more efficient, like a single SQL statement or something? P.S. The DB is SQLite3, if that matters at all...
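
    A hedged correction: as written, the subquery has no ORDER BY, so "limit 20" returns an arbitrary 20 rows rather than the last 20. In SQLite, assuming "last" means most recently inserted and the table is an ordinary rowid table, the implicit rowid tracks insertion order, so one sketch is:

        -- count zeros among the 20 most recently inserted rows
        SELECT COUNT(*)
        FROM (SELECT i FROM a ORDER BY rowid DESC LIMIT 20) AS last20
        WHERE i = 0;

    This is already a single statement, and SQLite can walk the table backwards by rowid to find the 20 rows, so it stays cheap even at 10k rows.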

    Read the article

  • Should an object be fully complete before being injected as a dependency?

    - by Hans
    This is an extension of this question: http://stackoverflow.com/questions/3027082/understanding-how-to-inject-object-dependencies. Since it is a bit different, I wanted to separate them to make it, hopefully, easier to answer. Also, this is not a real system, just a simplified example that I thought we'd all be familiar with. TIA.

    DB:

        threads: thread_id, thread_name, etc.
        posts: post_id, thread_id, post_name, post_contents, post_date, post_user_id, etc.

    Overview: Basically I'm looking at the most maintainable way to load $post_id and have it cascade and load the other things I want to know about, while keeping the controller skinny. BUT:

    - I'm ending up with too many dependencies to inject
    - I'm passing in initialized but empty objects
    - I want to limit how many parameters I am passing around
    - I could inject $post (many) into $thread (one), but on that page I'm not looking at a thread, I'm looking at a post
    - I could combine/inject them into a new object

    Detail: If I am injecting an object into another, is it best to have it fully created first? I'm trying to limit how many parameters I have to pass in to a page, but I end up with a circle.

        // 1, empty object injected via constructor
        $thread = new Thread;
        $post = new Post($thread); // $thread is just an empty object
        $post->load($post_id);
        // I could now do something like $post->get('thread_id') to get everything I want in $post

        // 2, complete object injected via constructor
        $thread = new Thread;
        $thread->load($thread_id); // this page would have to have passed in a $thread_id, too
        $post = new Post($thread); // thread is a complete object, with the data I need, like thread name
        $post->load($post_id);

        // 3, inject $post into $thread, but this makes less sense to me,
        // since I'm looking at a post page, not a thread page
        $post = new Post();
        $post->load($post_id);
        $thread = new Thread($post);
        $thread->load(); // would load based on $post->get('post_id') and combine.
        // Now I have all the data I want, but it's non-intuitive for the hierarchy
        // to be Thread->Post instead of Post-with-thread-info

        // to go with example 1
        class Post {
            public function __construct(&$thread) {
                $this->thread = $thread;
            }
            public function load($id) {
                // ... here I would load all the post data based on $id
                // now include the thread data
                $this->thread->load($this->get('thread_id'));
                return $this;
            }
        }

        // I don't want to do
        $thread = new Thread;
        $post = new Post;
        $post->load($post_id);
        $thread->load($post->get('post_id'));

    Or, I could create a new object and inject both $post and $thread into it, but then I have an object with an increasing number of dependencies.

    Read the article
