Search Results

Search found 11153 results on 447 pages for 'count zero'.


  • Best non-development book for software developers

    - by Dima Malenko
    What is the best non-software-development book that you think every software developer should read? Note: there is a similar, poll-style question here: What non-programming books should programmers read? Update: Peopleware is a great book and a must-read, no doubt, but it is about software development, so it does not count. Update: We ended up suggesting more than one book, and that's great! Below is a summary (with links to Amazon) of the books you should consider for your reading list: The Design of Everyday Things by Donald Norman; Getting Things Done by David Allen; Gödel, Escher, Bach by Douglas R. Hofstadter; The Goal and It's Not Luck by Eliyahu M. Goldratt; Here Comes Everybody by Clay Shirky; ...to be continued.

    Read the article

  • delphi idhttp post related question

    - by paul
    Hello all, I'm new to Delphi and almost new to programming in general. I wrote a simple POST tool using the Indy IdHTTP component, but when I run it, it does not work correctly. The program checks the status of my accounts: if a login succeeds, the returned page source contains 'top.location =', and if the login fails it does not. account.txt is shown below; the first and third accounts are live, but only the first account gets checked - after the first one the remaining accounts can't be checked, and I have no idea what is wrong with it.

    ph896011 pk1089
    fsadfasdf dddddss
    ph896011 pk1089

    The Delphi source follows; any help is much appreciated!

    unit Unit1;

    interface

    uses
      Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
      Dialogs, StdCtrls, IdBaseComponent, IdComponent, IdTCPConnection, IdTCPClient,
      IdHTTP, IdCookieManager, ExtCtrls;

    type
      TForm1 = class(TForm)
        Button1: TButton;
        IdHTTP1: TIdHTTP;
        Memo1: TMemo;
        IdCookieManager1: TIdCookieManager;
        lstAcct: TListBox;
        result: TLabel;
        Edit1: TEdit;
        Timer1: TTimer;
        procedure Button1Click(Sender: TObject);
        //procedure FormCreate(Sender: TObject);
        //procedure FormClose(Sender: TObject; var Action: TCloseAction);
      private
        { Private declarations }
      public
        AccList: TStringList;
        IdCookie: TIdCookieManager;
        CookieList: TList;
        StartCnt: Integer;
        InputCnt: Integer;
        WordList: TStringList;
        WordNoList: TStringList;
        WordCntList: TStringList;
        StartTime: TDateTime;
      end;

    var
      Form1: TForm1;

    implementation

    {$R *.dfm}

    procedure TForm1.Button1Click(Sender: TObject);
    var
      i: Integer;
      //temp: String;
      lsttemp: TStringList;
      sl: TStringList;
      //userId, userPass: string;
    begin
      InputCnt := 0;
      WordList := TStringList.Create;
      CookieList := TList.Create;
      IdCookie := TIdCookieManager.Create(Self);
      if FileExists(ExtractFilePath(Application.ExeName) + 'account.txt') then
        WordList.LoadFromFile(ExtractFilePath(Application.ExeName) + 'account.txt');
      WordNoList := TStringList.Create;
      WordCntList := TStringList.Create;
      lsttemp := TStringList.Create;
      sl := TStringList.Create;
      try
        try
          for i := 0 to WordList.Count - 1 do
          begin
            ExtractStrings([' '], [' '], PChar(WordList[i]), lsttemp);
            WordNoList.Add(lsttemp[0]);   //ShowMessage(lsttemp[0]);
            WordCntList.Add(lsttemp[1]);  //ShowMessage(lsttemp[1]);
            sl.Add('ID=' + lsttemp[0]);
            sl.Add('PWD=' + lsttemp[1]);
            sl.Add('SECCHK=0');
            IdHTTP1.HandleRedirects := True;
            IdHTTP1.Request.ContentType := 'application/x-www-form-urlencoded';
            Memo1.Text := IdHTTP1.Post('http://user.buddybuddy.co.kr/Login/Login.asp', sl);
            if Pos('top.location =', Memo1.Text) > 0 then
            begin
              Application.ProcessMessages;
              ShowMessage('Alive Acc!');
              //result.Caption := 'alive acc';
              Sleep(1000);
              Edit1.Text := 'alive acc';
              lsttemp.Clear;
              Memo1.Text := '';
              //Memo1.Text := IdHTTP1.Get('https://user.buddybuddy.co.kr/Login/Logout.asp');
              Sleep(1000);
            end;
            if Pos('top.location =', Memo1.Text) <> 1 then
            begin
              Application.ProcessMessages;
              ShowMessage('bad');
              Edit1.Text := 'bad';
              //Edit1.Text := 'bad';
              lsttemp.Clear;
              Memo1.Text := '';
              Sleep(1000);
            end;
            Edit1.Text := '';
          end;
        finally
          lsttemp.Free;
        end;
        StartCnt := lstAcct.Items.Count;
        StartTime := Now;
      finally
        sl.Free;
      end;
    end;

    end.
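
    One thing that stands out in the code above, and a likely (but not confirmed) cause of the symptom: sl is created once before the loop and gets three more lines added on every iteration, but is never emptied, so from the second account onward the POST body still carries the first account's ID and PWD fields. A minimal sketch of that fix - only the top of the loop body changes, everything else stays as above:

    for i := 0 to WordList.Count - 1 do
    begin
      sl.Clear;       // start each login attempt with a fresh parameter list
      lsttemp.Clear;  // ExtractStrings appends, so clear this before re-filling it too
      ExtractStrings([' '], [' '], PChar(WordList[i]), lsttemp);
      // ... rest of the loop body unchanged ...
    end;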

    Read the article

  • What is a good layout for a somewhat advanced home network and storage solution?

    - by Shaun
    My home network/storage needs are changing and I am searching for some opinions and starting points on what a good network/storage layout would be that can serve my needs for a few years into the future. I think I have a decent starting point for equipment, but I am also willing to invest fairly heavily in a solution that can last me for a while. I am a bit of a tech nerd and I have a moderate tolerance for setting up the solution. I would prefer that maintenance of the system be fairly low once it is set up, but I am willing to accept some tradeoffs.

    Existing equipment:
    - Router - Netgear WNDR3700 (gigabit)
    - Router - DLink Gamerlounge DGL-4300 (gigabit)
    - Switch - 16 port Trendnet green switch (gigabit)
    - Switch - 5 port Trendnet green (gigabit)
    - Computer - i7-950 office computer (gigabit ethernet)
    - Computer - Q6600 quad core media center, hooked up to TV, records shows (gigabit ethernet)
    - Computer - Acer 1810T ultraportable laptop (gigabit and N ethernet)
    - NAS - Intel SS4200-E (gigabit)
    - External hard drive - 2TB WD Green drive (esata)
    - All kinds of miscellaneous network-connected kit: TV, Blu-ray, Verizon network extender, HDHomeRun TV tuners, etc.

    Requirements:
    - Robust backup solution for a growing collection of huge family picture files and personal files, around 1.5TB (including offsite backup)
    - Central location for all users' files, while also keeping them secure from each other
    - Storage for terabytes of movie backups and recorded TV, and access to them from all computers (maybe around 4TB eventually)
    - Possibility to host files to friends and family easily

    Nice to have:
    - Backup of terabytes of movie backups

    Intriguing possibilities:
    - Capability to have users' Windows desktops and files look the same from all network computers

    I am not sure if the new Windows Home Server 2011 would fit into this well, if I need a domain server, how best to organize my backups, or how to most effectively use RAID. Currently I am simply backing up all computers to a RAID 1 on the NAS box, which I was thinking could prevent a situation where I reach for a backup and find that the disk is corrupt. One possibility that I am thinking about now is simply using my media center PC with a huge RAID of hard drives on which all files are stored. Pseudo-backup of all files would be present because of the RAID, but important files would also be backed up off site by carrying hard drives to work. But what if corruption seeps into the files and the corrupted data is then backed up? Does RAID protect against this? I really want to take next to zero risks with the irreplaceable files. I can handle some degree of risk with the movies and other files. I'm looking for critiques of this idea as well as other possibilities. To summarize, my goal is high functionality, media capability, and robust backup of irreplaceable files.

    Read the article

  • LEFT OUTER JOIN SUM doubles problem

    - by Michael
    Hi I've got two tables:

    Table: Shopping
    shop_id | shop_name | shop_time
    1       | Brian     | 40
    2       | Brian     | 31
    3       | Tom       | 20
    4       | Brian     | 30

    Table: bananas
    banana_id | banana_amount | banana_person
    1         | 1             | Brian
    2         | 1             | Brian

    I now want it to print:
    Name: Tom | Time: 20 | Bananas: 0
    Name: Brian | Time: 101 | Bananas: 2

    I used this code:

    $result = dbquery("SELECT tz.*, tt.*, SUM(shop_time) as shoptime, count(banana_amount) as bananas
        FROM shopping tt
        LEFT OUTER JOIN bananas tz ON tt.shop_name=tz.banana_person
        GROUP by banana_person LIMIT 40");
    while ($data5 = dbarray($result)) {
        echo 'Name: '.$data5["shop_name"].' | Time: '.$data5["shoptime"].' | Bananas: '.$data5["bananas"].'<br>';
    }

    The problem is that I get this instead:
    Name: Tom | Time: 20 | Bananas: 0
    Name: Brian | Time: 202 | Bananas: 6

    I just don't know how to get around this.
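
    The doubling happens because every shopping row for Brian is matched against every bananas row for Brian before the aggregates run (3 shop rows x 2 banana rows = 6). A sketch of one common fix, using the table and column names above: aggregate the bananas in a derived table first, so the join is one row per person.

    SELECT tt.shop_name,
           SUM(tt.shop_time) AS shoptime,
           COALESCE(tz.bananas, 0) AS bananas
    FROM shopping tt
    LEFT OUTER JOIN (
        SELECT banana_person, COUNT(banana_amount) AS bananas
        FROM bananas
        GROUP BY banana_person
    ) tz ON tz.banana_person = tt.shop_name
    GROUP BY tt.shop_name, tz.bananas
    LIMIT 40;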

    Read the article

  • Hard Disk DRDY error: is it a crash

    - by pranjal
    I am using IBM Thinkpad, 1.7GHz, 512 RAM with Linux Mint 9 installed. I have two partitions in addition to root. One of the partitions became read-only yesterday, after which I rebooted my system. It is extremely slow along with DRDY Error : Is my Hard disk crashed ? Error Log while booting. Differences between boot sector and its backup. failed command : READ DMA BMDMA : stat 0X25 ata 1.00 : status : { DRDY ERR } ata 1.00 : status :{ UNC } Buffer I/O error on logical device, logical block 65467 smartctl output for the partition: mint mint # smartctl -a /dev/sda1 smartctl version 5.38 [i686-pc-linux-gnu] Copyright (C) 2002-8 Bruce Allen Home page is http://smartmontools.sourceforge.net/ === START OF INFORMATION SECTION === Device Model: TOSHIBA MK4026GAX RoHS Serial Number: X5LY1623T Firmware Version: PA107E User Capacity: 40,007,761,920 bytes Device is: Not in smartctl database [for details use: -P showall] ATA Version is: 6 ATA Standard is: Exact ATA specification draft version not indicated Local Time is: Thu Feb 17 06:48:25 2011 UTC SMART support is: Available - device has SMART capability. SMART support is: Enabled === START OF READ SMART DATA SECTION === SMART overall-health self-assessment test result: PASSED General SMART Values: Offline data collection status: (0x84) Offline data collection activity was suspended by an interrupting command from host. Auto Offline Data Collection: Enabled. Self-test execution status: ( 0) The previous self-test routine completed without error or no self-test has ever been run. Total time to complete Offline data collection: ( 153) seconds. Offline data collection capabilities: (0x1b) SMART execute Offline immediate. Auto Offline data collection on/off support. Suspend Offline collection upon new command. Offline surface scan supported. Self-test supported. No Conveyance Self-test supported. No Selective Self-test supported. SMART capabilities: (0x0003) Saves SMART data before entering power-saving mode. Supports SMART auto save timer. Error logging capability: (0x01) Error logging supported. No General Purpose Logging support. Short self-test routine recommended polling time: ( 2) minutes. Extended self-test routine recommended polling time: ( 30) minutes. 
SMART Attributes Data Structure revision number: 16 Vendor Specific SMART Attributes with Thresholds: ID# ATTRIBUTE_NAME FLAG VALUE WORST THRESH TYPE UPDATED WHEN_FAILED RAW_VALUE 1 Raw_Read_Error_Rate 0x000b 100 100 050 Pre-fail Always - 0 2 Throughput_Performance 0x0005 100 100 050 Pre-fail Offline - 0 3 Spin_Up_Time 0x0027 100 100 001 Pre-fail Always - 310 4 Start_Stop_Count 0x0032 100 100 000 Old_age Always - 3968 5 Reallocated_Sector_Ct 0x0033 100 100 050 Pre-fail Always - 40 7 Seek_Error_Rate 0x000b 100 100 050 Pre-fail Always - 0 8 Seek_Time_Performance 0x0005 100 100 050 Pre-fail Offline - 0 9 Power_On_Hours 0x0032 082 082 000 Old_age Always - 7257 10 Spin_Retry_Count 0x0033 179 100 030 Pre-fail Always - 0 12 Power_Cycle_Count 0x0032 100 100 000 Old_age Always - 3484 192 Power-Off_Retract_Count 0x0032 100 100 000 Old_age Always - 489 193 Load_Cycle_Count 0x0032 064 064 000 Old_age Always - 367150 194 Temperature_Celsius 0x0022 100 100 000 Old_age Always - 36 (Lifetime Min/Max 14/57) 196 Reallocated_Event_Count 0x0032 100 100 000 Old_age Always - 33 197 Current_Pending_Sector 0x0032 100 100 000 Old_age Always - 82 198 Offline_Uncorrectable 0x0030 100 100 000 Old_age Offline - 1 199 UDMA_CRC_Error_Count 0x0032 200 253 000 Old_age Always - 0 220 Disk_Shift 0x0002 100 100 000 Old_age Always - 101 222 Loaded_Hours 0x0032 085 085 000 Old_age Always - 6146 223 Load_Retry_Count 0x0032 100 100 000 Old_age Always - 0 224 Load_Friction 0x0022 100 100 000 Old_age Always - 0 226 Load-in_Time 0x0026 100 100 000 Old_age Always - 227 240 Head_Flying_Hours 0x0001 100 100 001 Pre-fail Offline - 0 SMART Error Log Version: 1 ATA Error Count: 2371 (device log contains only the most recent five errors) CR = Command Register [HEX] FR = Features Register [HEX] SC = Sector Count Register [HEX] SN = Sector Number Register [HEX] CL = Cylinder Low Register [HEX] CH = Cylinder High Register [HEX] DH = Device/Head Register [HEX] DC = Device Command Register [HEX] ER = Error register [HEX] ST = Status register [HEX] Powered_Up_Time is measured from power on, and printed as DDd+hh:mm:SS.sss where DD=days, hh=hours, mm=minutes, SS=sec, and sss=millisec. It "wraps" after 49.710 days. Error 2371 occurred at disk power-on lifetime: 7256 hours (302 days + 8 hours) When the command that caused the error occurred, the device was active or idle. After command completion occurred, registers were: ER ST SC SN CL CH DH -- -- -- -- -- -- -- 40 51 05 1a 1b 00 e0 Error: UNC 5 sectors at LBA = 0x00001b1a = 6938 Commands leading to the command that caused the error were: CR FR SC SN CL CH DH DC Powered_Up_Time Command/Feature_Name -- -- -- -- -- -- -- -- ---------------- -------------------- c8 00 05 1a 1b 00 e0 00 00:03:10.061 READ DMA f8 00 00 00 00 00 e0 00 00:03:10.061 READ NATIVE MAX ADDRESS ec 00 00 00 00 00 a0 02 00:03:10.053 IDENTIFY DEVICE ef 03 45 00 00 00 a0 02 00:03:10.053 SET FEATURES [Set transfer mode] f8 00 00 00 00 00 e0 00 00:03:10.053 READ NATIVE MAX ADDRESS Error 2370 occurred at disk power-on lifetime: 7256 hours (302 days + 8 hours) When the command that caused the error occurred, the device was active or idle. 
After command completion occurred, registers were: ER ST SC SN CL CH DH -- -- -- -- -- -- -- 40 51 05 1a 1b 00 e0 Error: UNC 5 sectors at LBA = 0x00001b1a = 6938 Commands leading to the command that caused the error were: CR FR SC SN CL CH DH DC Powered_Up_Time Command/Feature_Name -- -- -- -- -- -- -- -- ---------------- -------------------- c8 00 05 1a 1b 00 e0 00 00:03:03.328 READ DMA f8 00 00 00 00 00 e0 00 00:03:03.327 READ NATIVE MAX ADDRESS ec 00 00 00 00 00 a0 02 00:03:03.320 IDENTIFY DEVICE ef 03 45 00 00 00 a0 02 00:03:03.319 SET FEATURES [Set transfer mode] f8 00 00 00 00 00 e0 00 00:03:03.319 READ NATIVE MAX ADDRESS Error 2369 occurred at disk power-on lifetime: 7256 hours (302 days + 8 hours) When the command that caused the error occurred, the device was active or idle. After command completion occurred, registers were: ER ST SC SN CL CH DH -- -- -- -- -- -- -- 40 51 05 1a 1b 00 e0 Error: UNC 5 sectors at LBA = 0x00001b1a = 6938 Commands leading to the command that caused the error were: CR FR SC SN CL CH DH DC Powered_Up_Time Command/Feature_Name -- -- -- -- -- -- -- -- ---------------- -------------------- c8 00 05 1a 1b 00 e0 00 00:02:56.582 READ DMA f8 00 00 00 00 00 e0 00 00:02:56.582 READ NATIVE MAX ADDRESS ec 00 00 00 00 00 a0 02 00:02:56.574 IDENTIFY DEVICE ef 03 45 00 00 00 a0 02 00:02:56.574 SET FEATURES [Set transfer mode] f8 00 00 00 00 00 e0 00 00:02:56.574 READ NATIVE MAX ADDRESS Error 2368 occurred at disk power-on lifetime: 7256 hours (302 days + 8 hours) When the command that caused the error occurred, the device was active or idle. After command completion occurred, registers were: ER ST SC SN CL CH DH -- -- -- -- -- -- -- 40 51 05 1a 1b 00 e0 Error: UNC 5 sectors at LBA = 0x00001b1a = 6938 Commands leading to the command that caused the error were: CR FR SC SN CL CH DH DC Powered_Up_Time Command/Feature_Name -- -- -- -- -- -- -- -- ---------------- -------------------- c8 00 05 1a 1b 00 e0 00 00:02:49.809 READ DMA f8 00 00 00 00 00 e0 00 00:02:49.809 READ NATIVE MAX ADDRESS ec 00 00 00 00 00 a0 02 00:02:49.801 IDENTIFY DEVICE ef 03 45 00 00 00 a0 02 00:02:49.801 SET FEATURES [Set transfer mode] f8 00 00 00 00 00 e0 00 00:02:49.801 READ NATIVE MAX ADDRESS Error 2367 occurred at disk power-on lifetime: 7256 hours (302 days + 8 hours) When the command that caused the error occurred, the device was active or idle. After command completion occurred, registers were: ER ST SC SN CL CH DH -- -- -- -- -- -- -- 40 51 05 1a 1b 00 e0 Error: UNC 5 sectors at LBA = 0x00001b1a = 6938 Commands leading to the command that caused the error were: CR FR SC SN CL CH DH DC Powered_Up_Time Command/Feature_Name -- -- -- -- -- -- -- -- ---------------- -------------------- c8 00 05 1a 1b 00 e0 00 00:02:43.056 READ DMA f8 00 00 00 00 00 e0 00 00:02:43.056 READ NATIVE MAX ADDRESS ec 00 00 00 00 00 a0 02 00:02:43.048 IDENTIFY DEVICE ef 03 45 00 00 00 a0 02 00:02:43.048 SET FEATURES [Set transfer mode] f8 00 00 00 00 00 e0 00 00:02:43.047 READ NATIVE MAX ADDRESS SMART Self-test log structure revision number 1 No self-tests have been logged. [To run self-tests, use: smartctl -t] Device does not support Selective Self Tests/Logging Do I need to get a new Hard Disk my PC ?
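
    With 40 reallocated sectors, 82 pending sectors and over two thousand logged ATA errors, the problem is the drive itself rather than the filesystem, so the short answer is yes: copy off what still reads and replace the disk. If you want the drive's own verdict first, a quick sketch of the usual check (run as root; /dev/sda assumed to be the disk, as in the smartctl call above):

    smartctl -t long /dev/sda      # start the extended self-test (the log above estimates ~30 minutes)
    smartctl -l selftest /dev/sda  # read the result once the test has finished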

    Read the article

  • dynamic insert php mysql and performance

    - by Ross
    I have a folder/array of images, it may be 1, maximum of 12. What I need to do is dynamically add them so the images are added to an images table. At the moment I have $directory = "portfolio_images/$id/Thumbs/"; $images = glob("" . $directory . "*.jpg"); for ( $i= 0; $i <= count($images); $i += 1) { mysql_query("INSERT INTO project_images (image_name, project_id)VALUES ('$images[0]', '$id')") or die(mysql_error()); } this is fine but it does not feel right, how is this for performance? Is there a better way? The maximum number of images is only ever going to be 12. Thanks, Ross
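
    Performance-wise a loop is fine for 12 files, but two details in the snippet above are worth noting: the condition uses <= count($images), which runs one iteration too many, and the VALUES clause always inserts $images[0] instead of $images[$i]. A sketch of a single multi-row INSERT that avoids both issues (same $directory/$id variables as above, old mysql_* API kept for consistency with the question):

    $values = array();
    foreach ($images as $image) {
        $values[] = "('" . mysql_real_escape_string($image) . "', '" . mysql_real_escape_string($id) . "')";
    }
    if (!empty($values)) {
        // one round-trip to MySQL instead of up to twelve
        mysql_query("INSERT INTO project_images (image_name, project_id) VALUES " . implode(', ', $values))
            or die(mysql_error());
    }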

    Read the article

  • Update {node_counter} table programmatically in Drupal

    - by wiifm69
    I currently use the statistics module (core) on my Drupal 6 install. This increments a count in the {node_counter} table every time the node is viewed, and this works. My question is - can I programmatically increment this counter as well? I am looking to achieve this when users interact with content created from views (say click a lightbox), so being able to update the table with AJAX would be ideal. I have done a quick search on d.o and there doesn't appear to be any modules that stick out straight away. Does anyone have any experience with this?
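
    Yes - the statistics module just runs an UPDATE against {node_counter} on each page view, so an AJAX-callable menu callback can do the same. A sketch for Drupal 6 (the module, path and function names are illustrative, and the column list assumes core's {node_counter} schema of nid, totalcount, daycount, timestamp):

    function mymodule_menu() {
      $items['mymodule/count/%'] = array(
        'page callback' => 'mymodule_count_hit',
        'page arguments' => array(2),
        'access arguments' => array('access content'),
        'type' => MENU_CALLBACK,
      );
      return $items;
    }

    function mymodule_count_hit($nid) {
      db_query("UPDATE {node_counter} SET daycount = daycount + 1, totalcount = totalcount + 1, timestamp = %d WHERE nid = %d", time(), $nid);
      // statistics.module inserts the row the first time a node is viewed; mirror that here
      if (!db_affected_rows()) {
        db_query("INSERT INTO {node_counter} (nid, daycount, totalcount, timestamp) VALUES (%d, 1, 1, %d)", $nid, time());
      }
      drupal_json(array('status' => 'ok'));
    }

    The lightbox click handler then only has to request mymodule/count/NID via AJAX.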

    Read the article

  • Creating an Excel 2007 bubble chart with date values in axis

    - by Shadowfoot
    I'm trying to create a graph showing the duration of issue resolution. I believe a bubble chart in excel will show what I want but I can't manage to get it working correctly. For each date I have a number of days (duration) and a number of issues (magnitude). Most dates have a duration of 1 with a large magnitude, and I want to avoid the outliers dominating the chart. e.g. 1-Feb, 1, 15 1-Feb, 2, 10 1-Feb, 9, 1 2-Feb, 1, 11 2-Feb, 2, 14 2-Feb, 6, 2 2-Feb, 18, 1 etc. I want the data in this example to give me 2 columns of bubbles. When I try to get excel to create the chart I can't get the date to appear at on the x axis; I get a count (representing the row of the data) instead, and as each row is a separate column, the values don't line up for the date. Can this be done in Excel 2007 without using VBA?

    Read the article

  • Ignoring old multiple asynchronous ajax requests

    - by Travis
    I've got a custom javascript autocomplete script that hits the server with multiple asynchronous ajax requests. (Everytime a key gets pressed.) I've noticed that sometimes an earlier ajax request will be returned after a later requests, which messes things up. The way I handle this now is I have a counter that increments for each ajax request. Requests that come back with a lower count get ignored. I'm wondering: Is this proper? Or is there a better way of dealing with this issue? Thanks in advance, Travis
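
    The counter approach is a perfectly normal way to handle this; the other common option is to abort the previous XMLHttpRequest before issuing a new one, so at most one request is ever in flight. A sketch of the counter version (the endpoint and rendering function are illustrative, not from the original script):

    var latestRequest = 0;

    function fetchSuggestions(term) {
        var requestId = ++latestRequest;             // stamp this request
        $.get('/suggest', { q: term }, function (data) {
            if (requestId !== latestRequest) return; // a newer request exists; ignore this reply
            renderSuggestions(data);                 // hypothetical rendering function
        });
    }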

    Read the article

  • Trouble getting search results using MGTwitterEngine

    - by cannyboy
    I'm using a version of Matt Gemmell's MGTwitterEngine, and I'm trying to get some results from the getSearchResultsForQuery method. // do a search [_engine getSearchResultsForQuery:@"#quote"]; // delegate method for handling result - (void)searchResultsReceived:(NSArray *)searchResults forRequest:(NSString *)connectionIdentifier { if ([searchResults count] > 0) { NSDictionary *singleResult = [searchResults objectAtIndex:1]; NSString *text = [singleResult valueForKey:@"text"]; NSLog(@"Text at Index one: \n %@", text); } } However, I never appear to get the result. In the console, I get: Request 7E4C3097-88D6-45F1-90D2-AD8205FBAAC5 failed with error: Error Domain=HTTP Code=400 "Operation could not be completed. (HTTP error 400.)" Is there a way around this? Am I implementing the delegate method right? (Also, I had difficulty installing the YAJL stuff for the engine, and wonder if that has something to do with it)

    Read the article

  • R strsplit and vectorization

    - by James
    When creating functions that use strsplit, vector inputs do not behave as desired, and sapply needs to be used. This is due to the list output that strsplit produces. Is there a way to vectorize the process - that is, so the function produces the correct element of the list for each element of the input? For example, to count the lengths of words in a character vector:

    words <- c("a","quick","brown","fox")
    > length(strsplit(words,""))
    [1] 4   # The number of words (length of the list)
    > length(strsplit(words,"")[[1]])
    [1] 1   # The length of the first word only
    > sapply(words, function (x) length(strsplit(x,"")[[1]]))
        a quick brown   fox
        1     5     5     3   # Success, but potentially very slow

    Ideally, something like length(strsplit(words,"")[[.]]) where . is interpreted as being the relevant part of the input vector.
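
    A couple of vectorized alternatives, sketched under the assumption of a reasonably recent R (lengths() needs R >= 3.2); for this particular task nchar() skips the split entirely:

    words <- c("a","quick","brown","fox")

    # lengths() applies length() to each element of the list strsplit() returns
    lengths(strsplit(words, ""))
    # [1] 1 5 5 3

    # for word lengths specifically, no split is needed at all
    nchar(words)
    # [1] 1 5 5 3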

    Read the article

  • Atomic INSERT/SELECT in HSQLDB

    - by PartlyCloudy
    Hello, I have the following hsqldb table, in which I map UUIDs to auto incremented IDs: SHORT_ID (BIG INT, PK, auto incremented) | UUID (VARCHAR, unique) Create command: CREATE TABLE table (SHORT_ID BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY, UUID VARCHAR(36) UNIQUE) In order to add new pairs concurrently, I want to use the atomic MERGE INTO statement. So my (prepared) statement looks like this: MERGE INTO table USING (VALUES(CAST(? AS VARCHAR(36)))) AS v(x) ON ID_MAP.UUID = v.x WHEN NOT MATCHED THEN INSERT VALUES v.x When I execute the statement (setting the placeholder correctly), I always get a Caused by: org.hsqldb.HsqlException: row column count mismatch Could you please give me a hint, what is going wrong here? Thanks in advance.
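
    One likely cause of the "row column count mismatch" is that the INSERT branch names no columns, so HSQLDB expects a value for every column of the table, including the identity, while the VALUES row supplies only one. A sketch of that fix, using ID_MAP as the table name since that is what the ON clause references:

    MERGE INTO ID_MAP
    USING (VALUES(CAST(? AS VARCHAR(36)))) AS v(x)
    ON ID_MAP.UUID = v.x
    WHEN NOT MATCHED THEN INSERT (UUID) VALUES (v.x)

    Supplying DEFAULT for the identity instead, i.e. INSERT VALUES (DEFAULT, v.x), should also be accepted.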

    Read the article

  • will_paginate undefined method error - Ruby on Rails

    - by bgadoci
    I just installed the gem for will_paginate and it says that it was installed successfully. I followed all the instructions listed with the plugin and I am getting an 'undefined method `paginate' for' error. Can't find much in the way of Google search and haven't been able to fix it myself (obviously). Here is the code: PostsController def index @tag_counts = Tag.count(:group => :tag_name, :order => 'updated_at DESC', :limit => 10) @posts = Post.paginate :page => params[:page], :per_page => 50 respond_to do |format| format.html # index.html.erb format.xml { render :xml => @posts } format.json { render :json => @posts } format.atom end end /model/post.rb class Post < ActiveRecord::Base validates_presence_of :body, :title has_many :comments, :dependent => :destroy has_many :tags, :dependent => :destroy cattr_reader :per_page @@per_page = 10 end /posts/views/index.html.erb <%= will_paginate @posts %>
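
    A frequent cause of this error is that the gem is installed but never loaded by the Rails app, so Post never gains the paginate class method. A sketch of the Rails 2.x way to declare it (the version constraint is illustrative), followed by a server restart:

    # config/environment.rb
    Rails::Initializer.run do |config|
      config.gem 'will_paginate', :version => '~> 2.3'
    end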

    Read the article

  • How to optimize this MySQL query

    - by James Simpson
    This query was working fine when the database was small, but now that there are millions of rows in the database, I am realizing I should have looked at optimizing this earlier. It is looking at over 600,000 rows and is Using where; Using temporary; Using filesort (which leads to an execution time of 5-10 seconds). It is using an index on the field 'battle_type.' SELECT username, SUM( outcome ) AS wins, COUNT( * ) - SUM( outcome ) AS losses FROM tblBattleHistory WHERE battle_type = '0' && outcome < '2' GROUP BY username ORDER BY wins DESC , losses ASC , username ASC LIMIT 0 , 50
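
    The single-column index on battle_type cannot help with the grouping or the aggregated columns, which is why the temporary table and filesort appear. A sketch of a composite, covering index that usually helps this shape of query (the index name is illustrative; column order is the equality filter first, then the grouping column, then the column the aggregates read):

    ALTER TABLE tblBattleHistory
      ADD INDEX idx_type_user_outcome (battle_type, username, outcome);

    With that index MySQL can walk the rows for battle_type = '0' already ordered by username, so the GROUP BY needs no temporary table, and since outcome is also in the index the query never touches the table rows; the remaining sort only has to order the grouped result for the final ORDER BY.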

    Read the article

  • Cassandra/HBase or just MySQL: Potential problems doing the next thing

    - by alexeypro
    Say I have "user". It's the key. And I need to keep "user count". I am planning to have record with key "user" and value "0" to "9999+ ;-)" (as many as I'll have). What problems I will drive in if I use Cassandra, HBase or MySQL for that? Say, I have thousand of new updates to this "user" key, where I need to increment the value. Am I in trouble? Locked for writes? Any other way of doing that? Why this is done -- there will be a lot of "user"-like keys. Different other cases. But the idea is the same. Why keep it this way -- because I'll have more reads, so I can always get "counted value" very fast.

    Read the article

  • retrieve POST data from Flash to ASP.Net

    - by Martin Ongtangco
    here's my AS3 code: var jpgEncoder:JPGEncoder = new JPGEncoder(100); var jpgStream:ByteArray = jpgEncoder.encode(bitmapData); var header:URLRequestHeader = new URLRequestHeader("Content-type", "application/octet-stream"); var jpgURLRequest:URLRequest = new URLRequest("/patients/webcam.aspx"); jpgURLRequest.requestHeaders.push(header); jpgURLRequest.method = URLRequestMethod.POST; jpgURLRequest.data = jpgStream; navigateToURL(jpgURLRequest, "_self"); And here's my ASP.Net Code try { string pt = Path.Combine(PathFolder, "test.jpg"); HttpFileCollection fileCol = Request.Files; Response.Write(fileCol.Count.ToString()); foreach (HttpPostedFile hpf in fileCol) { hpf.SaveAs(pt); } } catch (Exception ex) { Response.Write(ex.Message); } im getting a weird error, HttpFox mentioned: "NS_ERROR_NET_RESET" Any help would be excellent! Thanks!
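
    Request.Files only sees multipart/form-data uploads; with Content-Type application/octet-stream the JPEG bytes arrive as the raw request body, so the collection stays empty. A sketch of reading the body directly on the ASP.Net side (Stream.CopyTo needs .NET 4; on older frameworks read into a byte buffer in a loop):

    try
    {
        string pt = Path.Combine(PathFolder, "test.jpg");
        using (FileStream fs = new FileStream(pt, FileMode.Create))
        {
            Request.InputStream.CopyTo(fs);   // the POSTed ByteArray is the whole request body
        }
    }
    catch (Exception ex)
    {
        Response.Write(ex.Message);
    }

    The NS_ERROR_NET_RESET from HttpFox may simply be the page navigation interrupting the POST; if the response does not need to be shown in the browser, sending the request with URLLoader.load() instead of navigateToURL() keeps the upload in the background.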

    Read the article

  • C# yield return not returning an item as expected

    - by Jiho Han
    I have following code: private void ProcessQueue() { foreach (MessageQueueItem item in GetNextQueuedItem()) PerformAction(item); } private IEnumerable<MessageQueueItem> GetNextQueuedItem() { if (_messageQueue.Count > 0) yield return _messageQueue.Dequeue(); } Initially there is one item in the queue as ProcessQueue is called. During PerformAction, I would add more items to _messageQueue. However, the foreach loop quits after the initial item and does not see the subsequent items added. I sense that somehow the initial state of the queue is being captured by yield. Can someone explain what is happening and give a solution?
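
    Nothing is actually being captured: the iterator runs the if once, yields at most one item, and then falls off the end of the method, which ends the foreach. Re-checking the queue on every MoveNext keeps the loop alive while PerformAction enqueues more work; a minimal sketch:

    private IEnumerable<MessageQueueItem> GetNextQueuedItem()
    {
        // the condition is re-evaluated each time the caller asks for the next item,
        // so items enqueued during PerformAction are still picked up
        while (_messageQueue.Count > 0)
            yield return _messageQueue.Dequeue();
    }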

    Read the article

  • Saving State Dynamic UserControls...Help!

    - by Cognitronic
    I have page with a LinkButton on it that when clicked, I'd like to add a Usercontrol to the page. I need to be able to add/remove as many controls as the user would like. The Usercontrol consists of three dropdownlists. The first dropdownlist has it's auotpostback property set to true and hooks up the OnSelectedIndexChanged event that when fired will load the remaining two dropdownlists with the appropriate values. My problem is that no matter where I put the code in the host page, the usercontrol is not being loaded properly. I know I have to recreate the usercontrols on every postback and I've created a method that is being executed in the hosting pages OnPreInit method. I'm still getting the following error: The control collection cannot be modified during DataBind, Init, Load, PreRender or Unload phases. Here is my code: Thank you!!!! bool createAgain = false; IList<FilterOptionsCollectionView> OptionControls { get { if (SessionManager.Current["controls"] != null) return (IList<FilterOptionsCollectionView>)SessionManager.Current["controls"]; else SessionManager.Current["controls"] = new List<FilterOptionsCollectionView>(); return (IList<FilterOptionsCollectionView>)SessionManager.Current["controls"]; } set { SessionManager.Current["controls"] = value; } } protected void Page_Load(object sender, EventArgs e) { Master.Page.Title = Title; LoadViewControls(Master.MainContent, Master.SideBar, Master.ToolBarContainer); } protected override void OnPreInit(EventArgs e) { base.OnPreInit(e); System.Web.UI.MasterPage m = Master; Control control = GetPostBackControl(this); if ((control != null && control.ClientID == (lbAddAndCondtion.ClientID) || createAgain)) { createAgain = true; CreateUserControl(control.ID); } } protected void AddAndConditionClicked(object o, EventArgs e) { var control = LoadControl("~/Views/FilterOptionsCollectionView.ascx"); OptionControls.Add((FilterOptionsCollectionView)control); control.ID = "options" + OptionControls.Count.ToString(); phConditions.Controls.Add(control); } public event EventHandler<Insight.Presenters.PageViewArg> OnLoadData; private Control FindControlRecursive(Control root, string id) { if (root.ID == id) { return root; } foreach (Control c in root.Controls) { Control t = FindControlRecursive(c, id); if (t != null) { return t; } } return null; } protected Control GetPostBackControl(System.Web.UI.Page page) { Control control = null; string ctrlname = Page.Request.Params["__EVENTTARGET"]; if (ctrlname != null && ctrlname != String.Empty) { control = FindControlRecursive(page, ctrlname.Split('$')[2]); } else { string ctrlStr = String.Empty; Control c = null; foreach (string ctl in Page.Request.Form) { if (ctl.EndsWith(".x") || ctl.EndsWith(".y")) { ctrlStr = ctl.Substring(0, ctl.Length - 2); c = page.FindControl(ctrlStr); } else { c = page.FindControl(ctl); } if (c is System.Web.UI.WebControls.CheckBox || c is System.Web.UI.WebControls.CheckBoxList) { control = c; break; } } } return control; } protected void CreateUserControl(string controlID) { try { if (createAgain && phConditions != null) { if (OptionControls.Count > 0) { phConditions.Controls.Clear(); foreach (var c in OptionControls) { phConditions.Controls.Add(c); } } } } catch (Exception ex) { throw ex; } } Here is the usercontrol's code: <%@ Control Language="C#" AutoEventWireup="true" CodeBehind="FilterOptionsCollectionView.ascx.cs" Inherits="Insight.Website.Views.FilterOptionsCollectionView" %> namespace Insight.Website.Views { [ViewStateModeById] public partial class 
FilterOptionsCollectionView : System.Web.UI.UserControl { protected void Page_Load(object sender, EventArgs e) { } protected override void OnInit(EventArgs e) { LoadColumns(); ddlColumns.SelectedIndexChanged += new RadComboBoxSelectedIndexChangedEventHandler(ColumnsSelectedIndexChanged); base.OnInit(e); } protected void ColumnsSelectedIndexChanged(object o, EventArgs e) { LoadCriteria(); } public void LoadColumns() { ddlColumns.DataSource = User.GetItemSearchProperties(); ddlColumns.DataTextField = "SearchColumn"; ddlColumns.DataValueField = "CriteriaSearchControlType"; ddlColumns.DataBind(); LoadCriteria(); } private void LoadCriteria() { var controlType = User.GetItemSearchProperties()[ddlColumns.SelectedIndex].CriteriaSearchControlType; var ops = User.GetItemSearchProperties()[ddlColumns.SelectedIndex].ValidOperators; ddlOperators.DataSource = ops; ddlOperators.DataTextField = "key"; ddlOperators.DataValueField = "value"; ddlOperators.DataBind(); switch (controlType) { case ResourceStrings.ViewFilter_ControlTypes_DDL: criteriaDDL.Visible = true; criteriaText.Visible = false; var crit = User.GetItemSearchProperties()[ddlColumns.SelectedIndex].SearchCriteria; ddlCriteria.DataSource = crit; ddlCriteria.DataBind(); break; case ResourceStrings.ViewFilter_ControlTypes_Text: criteriaDDL.Visible = false; criteriaText.Visible = true; break; } } public event EventHandler OnColumnChanged; public ISearchCriterion FilterOptionsValues { get; set; } } }
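
    A pattern that usually avoids the "control collection cannot be modified" error is to store only the number of added rows (not the control instances) and rebuild the controls with the same IDs in OnInit, before ViewState is restored and before any data binding runs. A sketch under that assumption (the Session key is illustrative, and the cached control list above would no longer be needed):

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);
        int count = (int)(Session["conditionCount"] ?? 0);
        for (int i = 1; i <= count; i++)
        {
            var ctl = (FilterOptionsCollectionView)LoadControl("~/Views/FilterOptionsCollectionView.ascx");
            ctl.ID = "options" + i;              // same IDs as when they were first created
            phConditions.Controls.Add(ctl);      // added early enough for ViewState and postback events
        }
    }

    protected void AddAndConditionClicked(object o, EventArgs e)
    {
        int count = (int)(Session["conditionCount"] ?? 0) + 1;
        Session["conditionCount"] = count;
        var ctl = (FilterOptionsCollectionView)LoadControl("~/Views/FilterOptionsCollectionView.ascx");
        ctl.ID = "options" + count;
        phConditions.Controls.Add(ctl);
    }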

    Read the article

  • C# WPF application is using too much memory while GC.GetTotalMemory() is low

    - by Dmitry
    I wrote little WPF application with 2 threads - main thread is GUI thread and another thread is worker. App has one WPF form with some controls. There is a button, allowing to select directory. After selecting directory, application scans for .jpg files in that directory and checks if their thumbnails are in hashtable. if they are, it does nothing. else it's adding their full filenames to queue for worker. Worker is taking filenames from this queue, loading JPEG images (using WPF's JpegBitmapDecoder and BitmapFrame), making thumbnails of them (using WPF's TransformedBitmap) and adding them to hashtable. Everything works fine, except memory consumption by this application when making thumbnails for big images (like 5000x5000 pixels). I've added textboxes on my form to show memory consumption (GC.GetTotalMemory() and Process.GetCurrentProcess().PrivateMemorySize64) and was very surprised, cuz GC.GetTotalMemory() stays close to 1-2 Mbytes, while private memory size constantly grows, especially when loading new image (~ +100Mb per image). Even after loading all images, making thumbnails of them and freeing original images, private memory size stays at ~700-800Mbytes. My VirtualBox is limited to 512Mb of physical memory and Windows in VirtualBox starts to swap alot to handle this huge memory consumption. I guess I'm doing something wrong, but I don't know how to investigate this problem, cuz according to GC, allocated memory size is very low. Attaching code of thumbnail loader class: class ThumbnailLoader { Hashtable thumbnails; Queue<string> taskqueue; EventWaitHandle wh; Thread[] workers; bool stop; object locker; int width, height, processed, added; public ThumbnailLoader() { int workercount,i; wh = new AutoResetEvent(false); thumbnails = new Hashtable(); taskqueue = new Queue<string>(); stop = false; locker = new object(); width = height = 64; processed = added = 0; workercount = Environment.ProcessorCount; workers=new Thread[workercount]; for (i = 0; i < workercount; i++) { workers[i] = new Thread(Worker); workers[i].IsBackground = true; workers[i].Priority = ThreadPriority.Highest; workers[i].Start(); } } public void SetThumbnailSize(int twidth, int theight) { width = twidth; height = theight; if (thumbnails.Count!=0) AddTask("#resethash"); } public void GetProgress(out int Added, out int Processed) { Added = added; Processed = processed; } private void AddTask(string filename) { lock(locker) { taskqueue.Enqueue(filename); wh.Set(); added++; } } private string NextTask() { lock(locker) { if (taskqueue.Count == 0) return null; else { processed++; return taskqueue.Dequeue(); } } } public static string FileNameToHash(string s) { return FormsAuthentication.HashPasswordForStoringInConfigFile(s, "MD5"); } public bool GetThumbnail(string filename,out BitmapFrame thumbnail) { string hash; hash = FileNameToHash(filename); if (thumbnails.ContainsKey(hash)) { thumbnail=(BitmapFrame)thumbnails[hash]; return true; } AddTask(filename); thumbnail = null; return false; } private BitmapFrame LoadThumbnail(string filename) { FileStream fs; JpegBitmapDecoder bd; BitmapFrame oldbf, bf; TransformedBitmap tb; double scale, dx, dy; fs = new FileStream(filename, FileMode.Open); bd = new JpegBitmapDecoder(fs, BitmapCreateOptions.None, BitmapCacheOption.OnLoad); oldbf = bd.Frames[0]; dx = (double)oldbf.Width / width; dy = (double)oldbf.Height / height; if (dx > dy) scale = 1 / dx; else scale = 1 / dy; tb = new TransformedBitmap(oldbf, new ScaleTransform(scale, scale)); bf = BitmapFrame.Create(tb); fs.Close(); 
oldbf = null; bd = null; GC.Collect(); return bf; } public void Dispose() { lock(locker) { stop = true; } AddTask(null); foreach (Thread worker in workers) { worker.Join(); } wh.Close(); } private void Worker() { string curtask,hash; while (!stop) { curtask = NextTask(); if (curtask == null) wh.WaitOne(); else { if (curtask == "#resethash") thumbnails.Clear(); else { hash = FileNameToHash(curtask); try { thumbnails[hash] = LoadThumbnail(curtask); } catch { thumbnails[hash] = null; } } } } } }
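
    Part of the gap between GC.GetTotalMemory() and private bytes is that WPF keeps the decoded bitmaps in native memory, which the managed-heap number never sees. Decoding straight to thumbnail size avoids ever materialising the full 5000x5000 frame; a sketch of an alternative loader built on that idea (DecodePixelWidth, BitmapCacheOption.OnLoad and Freeze are standard WPF API):

    private BitmapImage LoadThumbnailSmall(string filename)
    {
        BitmapImage bi = new BitmapImage();
        bi.BeginInit();
        bi.UriSource = new Uri(filename);
        bi.DecodePixelWidth = 64;                   // decode directly at thumbnail size (aspect ratio kept)
        bi.CacheOption = BitmapCacheOption.OnLoad;  // read the file fully, then release the file handle
        bi.EndInit();
        bi.Freeze();                                // usable from the GUI thread without extra copies
        return bi;
    }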

    Read the article

  • UISegmentedControl tint color on touch

    - by gotye
    Hey everyone, I have a UISegmentedControl in my app (see code below) : // --------------- SETTING NAVIGATION BAR RIGHT BUTTONS NSArray *segControlItems = [NSArray arrayWithObjects:[UIImage imageNamed:@"up.png"],[UIImage imageNamed:@"down.png"], nil]; segControl = [[UISegmentedControl alloc] initWithItems:segControlItems]; segControl.segmentedControlStyle = UISegmentedControlStyleBar; segControl.momentary = YES; segControl.frame = CGRectMake(25.0, 7, 65.0, 30.0); segControl.tintColor = [UIColor blackColor]; [segControl addTarget:self action:@selector(segAction:) forControlEvents:UIControlEventValueChanged]; if (current == 0) [segControl setEnabled:NO forSegmentAtIndex:0]; if (current == ([news count]-1)) [segControl setEnabled:NO forSegmentAtIndex:1]; // --------------- But I can't make it to show something when you click on it ... It functionnally works perfectly but I would like it to tint to gray when you click on it (but just when you click) ... would that be possible ? Thank you, Gotye.

    Read the article

  • How to call a callback function on a 404 for a JSON ajax request with jQuery?

    - by shingara
    I want to make an Ajax request with a JSON response, so I wrote this:

    $.ajax({
        url: 'http://my_url',
        dataType: "json",
        success: function(data){ alert('success'); },
        error: function(data){ alert('error'); },
        complete: function(data) { alert('complete'); }
    });

    This code works fine, but when my URL returns an HTTP 404, none of the callbacks fire - not even the complete callback. From what I have found, this is because my dataType is 'json', the 404 response is HTML, and the JSON parsing fails, so no callback runs. Is there a way to call a callback function when a 404 is raised? EDIT: the complete callback is not called when the response is a 404. For a URL that returns a 404 you can call: http://twitter.com/status/user_timeline/jksqdlmjmsd.json?count=3&callback=jsonp1269278524295&_=1269278536697 - it is with this URL that I have the problem.
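
    Two things are worth separating here. For a same-origin JSON request, newer jQuery (1.5+) lets you attach a handler per status code; a sketch of that is below. The twitter.com URL in the edit, however, is a cross-domain JSONP call (note the callback= parameter), and with script-tag transport the browser never exposes the 404 to jQuery, so the usual workaround there is setting a timeout, which does surface as an error.

    $.ajax({
        url: 'http://my_url',
        dataType: "json",
        timeout: 5000,                      // makes silent JSONP failures surface as 'timeout' errors
        statusCode: {
            404: function() { alert('404 - not found'); }
        },
        success: function(data){ alert('success'); },
        error: function(xhr, status){ alert('error: ' + status); },
        complete: function(data){ alert('complete'); }
    });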

    Read the article

  • netlogo programming question - catalyst implementation part 2

    - by user286190
    Hi. The catalyst speeds up the reaction but remains unchanged after the reaction has taken place. I tried the following code:

    breed [catalysts catalyst]
    breed [chemical-x chemical-x]

    ; then the forward reaction is sped up by the existence of catalysts
    to react-forward
      let num-catalysts count catalysts
      ; speed up by num-catalysts
      ;...
    end

    It works fine, but I want the catalyst to be something that can be switched on and off with a 'switch' widget, so one can see the effects with and without the catalyst. I tried putting a switch in, but 'catalyst' has already been defined. I also want to make the catalyst visible, so one can see it in the actual implementation (in the world), like making it a turtle. Is there another way to implement this apart from using breeds? I tried making the catalyst a turtle but it doesn't work:

    ; Make catalyst visible in implementation
    clear-all
    crt catalysts 100
    ask catalysts [ set color white ]
    show [breed] of one-of catalysts  ; prints catalysts

    Any help will be greatly appreciated, thank you.
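
    A switch widget does not need to share the breed's name: naming it something like use-catalyst? avoids the "already defined" clash, and the reaction code just reads it at run time. A sketch under that assumption, plus the breed-specific create- form for making the catalysts visible turtles (crt only creates unbreeded turtles, which is why the call above fails):

    ;; interface: a switch named use-catalyst?
    to react-forward
      let speed-up 1
      if use-catalyst? [ set speed-up speed-up + count catalysts ]
      ;; ... apply speed-up to the forward reaction rate as before ...
    end

    to setup-catalysts
      create-catalysts 100 [
        set color white
        set shape "circle"
        setxy random-xcor random-ycor
      ]
    end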

    Read the article

  • Net::Cassandra::Easy equivalent of "SELECT * FROM ..."

    - by knorv
    When using Perl's Net::Cassandra::Easy the following code will retrieve columns col[1-3] from rows row[1-3]: $result = $cassandra->get(['row1', 'row2', 'row3'], family => 'Standard1', byname => ['col1', 'col2', 'col3'); The corresponding SQL would be: SELECT col1, col2, col3 FROM rows WHERE id IN ('row1', 'row2', 'row3'); Suppose instead that I want to retrieve all columns. In SQL terms that would be: SELECT * FROM rows WHERE id IN ('row1', 'row2', 'row3'); To get all columns I am currently using: $result = $cassandra->get(['row1', 'row2', 'row3'], family => 'Standard1', byoffset => { "count" => 1_000_000 }); This works as long as the number of columns does not exceed one million. While this works I'd assume that there is a cleaner way to do it. My question is: Is there any cleaner way to specify to Cassandra that I want to retrieve all columns for the maching rows?

    Read the article

  • Forward all traffic through an ssh tunnel

    - by Eamorr
    I hope someone can follow this and I'll explain as best I can. I'm trying to forward all traffic from port 6999 on x.x.x.224, through an ssh tunnel, and onto port 7000 on x.x.x.218. Here is some ASCII art: |browser|-----|Squid on x.x.x.224|------|ssh tunnel|------<satellite link>-----|Squid on x.x.x.218|-----|www| 3128 6999 7000 80 When I remove the ssh tunnel, everything works fine. The idea is to turn off encryption on the ssh tunnel (to save bandwidth) and turn on maximum compression (to save more bandwidth). This is because it's a satellite link. Here's the ssh tunnel I've been using: ssh -C -f -C -o CompressionLevel=9 -o Cipher=none [email protected] -L 7000:172.16.1.224:6999 -N The trouble is, I don't know how to get data from Squid on x.x.x.224 into the ssh tunnel? Am I going about this the wrong way? Should I create an ssh tunnel on x.x.x.218? I use iptables to stop squid on x.x.x.224 from reading port 80, but to feed from port 6999 instead (i.e. via the ssh tunnel). Do I need another iptables rule? Any comments greatly appreciated. Many thanks in advance, Regarding Eduardo Ivanec's question, here is a netstat -i any port 7000 -nn dump from x.x.x.218: 14:42:15.386462 IP 172.16.1.224.40006 > 172.16.1.218.7000: Flags [S], seq 2804513708, win 14600, options [mss 1460,sackOK,TS val 86702647 ecr 0,nop,wscale 4], length 0 14:42:15.386690 IP 172.16.1.218.7000 > 172.16.1.224.40006: Flags [R.], seq 0, ack 2804513709, win 0, length 0 Update 2: When I run the second command, I get the following error in my browser: ERROR The requested URL could not be retrieved The following error was encountered while trying to retrieve the URL: http://109.123.109.205/index.php Zero Sized Reply Squid did not receive any data for this request. Your cache administrator is webmaster. Generated Fri, 01 Jul 2011 16:06:06 GMT by remote-site (squid/2.7.STABLE9) remote-site is 172.16.1.224 When I do a tcpdump -i any port 7000 -nn I get the following: root@remote-site:~# tcpdump -i any port 7000 -nn tcpdump: verbose output suppressed, use -v or -vv for full protocol decode listening on any, link-type LINUX_SLL (Linux cooked), capture size 65535 bytes channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused channel 2: open failed: connect failed: Connection refused
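
    Two observations, for what they are worth. The "channel 2: open failed: connect failed: Connection refused" lines mean nothing is listening on the address:port that the -L rule points at, so that is worth checking before anything else. And one pattern that matches the diagram is to keep the tunnel's listening end on the same box as the squid that uses it, then tell that squid to treat the far-end squid as its parent cache. A sketch of that wiring, assuming the squid on x.x.x.218 really listens on 7000 and the tunnel is opened from x.x.x.224 (user name, addresses and ports here are taken from the description, not verified):

    # on x.x.x.224: local port 7000 -> squid on x.x.x.218:7000, compression on
    ssh -f -N -o Compression=yes -L 7000:127.0.0.1:7000 user@x.x.x.218

    # /etc/squid/squid.conf on x.x.x.224: send everything via the tunnel, never go direct
    cache_peer 127.0.0.1 parent 7000 0 no-query default
    never_direct allow all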

    Read the article

  • c# gridview row click

    - by Martijn
    When i click on a row in my gridview, i want to go to a other page with the id i get from the database. In my RowCreated event i have the following line: e.Row.Attributes.Add("onClick", ClientScript.GetPostBackClientHyperlink(this.grdSearchResults, "Select$" + e.Row.RowIndex)); To prevent error messages i have this code: protected override void Render(HtmlTextWriter writer) { // .NET will refuse to accept "unknown" postbacks for security reasons. Because of this we have to register all possible callbacks // This must be done in Render, hence the override for (int i = 0; i < grdSearchResults.Rows.Count; i++) { Page.ClientScript.RegisterForEventValidation(new System.Web.UI.PostBackOptions(grdSearchResults, "Select$" + i.ToString())); } // Do the standard rendering stuff base.Render(writer); } My question is, how can i give a row a unique id (from the DB) and when i click the row, another page is opened (like clicking on a href) and that page can read the id. Thnx
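
    A common way to carry the database id with each row is the GridView's DataKeyNames property, so the id never has to be rendered into the row's onclick. With the row click already posting back a Select command as above, the selected row's key is available in SelectedIndexChanged; a sketch (DataKeyNames="Id" and the target page name are assumptions, not from the original markup):

    // markup: <asp:GridView ID="grdSearchResults" runat="server" DataKeyNames="Id"
    //              OnSelectedIndexChanged="grdSearchResults_SelectedIndexChanged" ... />
    protected void grdSearchResults_SelectedIndexChanged(object sender, EventArgs e)
    {
        int id = (int)grdSearchResults.SelectedDataKey.Value;   // the database id for the clicked row
        Response.Redirect("Details.aspx?id=" + id);             // Details.aspx is a placeholder page name
    }

    The existing Render override with RegisterForEventValidation stays as it is; only the key storage and the redirect are new.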

    Read the article
