Search Results

Search found 6634 results on 266 pages for 'fast fashion'.

Page 173/266 | < Previous Page | 169 170 171 172 173 174 175 176 177 178 179 180  | Next Page >

  • WCF Caching Solution - Need Advice

    - by Brandon
The company I work for is looking to implement a caching solution. We have several WCF web services hosted, and we need to cache certain values that can be persisted and fetched regardless of a client's session to a service. I am looking at the following technologies: Caching Application Block 4.1, a WCF TCP service using HttpRuntime caching, Memcached (Win32 server plus client), and Microsoft AppFabric Caching Beta 2. Our test server is Windows Server 2003 with IIS 6, but our production server is Windows Server 2008, so any of the above options would work (except for AppFabric Caching on our test server). Does anyone have experience with any of these? This caching solution will not store a lot of data, but it will need to be read frequently and quickly. Thanks in advance.

    Read the article

  • JavaScript library more efficient than Rickshaw for realtime visualizations

    - by dan kutz
    I want to visualize data as time-series graphs on mobile devices (tablets) and therefore stumbled upon Rickshaw, which is based on D3. First I must say I was a little confused when I realized that "realtime" in web development is defined quite differently from realtime in engineering, which has fixed (and often very short) timeframes. Anyway, my aim is to visualize the data as fast as possible, and on older tablets visualization with Rickshaw is quite slow. Can anybody recommend another library that may be more efficient at rendering? Or is there no way around it and I have to go native? Regards, Dan.

    Read the article

  • Does any centralized version control faster than SVN exist?

    - by Savageman
    Hello, I've been using SVN for a long time and now we're trying out Git. I'm not talking about the centralized / decentralized debate here. My only concern is speed. The latter tool is much faster. But sometimes I NEED to work with a centralized approach, which is simpler and less complex than the decentralized one. The learning curve is short, which saves a lot of time (while digging into a decentralized tool would waste time, given that its learning curve is much longer and we run into more problems working with it). However, SVN is really slow compared to Git, and I don't think that has anything to do with it being centralized. Decentralized systems also have to deal with server connections and file transfers. So I can easily imagine that a faster implementation of centralized version control could exist. Does anyone have any clue on this?

    Read the article

  • Nested query to find details in table B for maximum value in table A

    - by jpatokal
    I've got a huge bunch of flights travelling between airports. Each airport has an ID and (x,y) coordinates. For a given list of flights, I want to find the northernmost (highest y) airport visited. Here's the query I'm currently using:

        SELECT name, iata, icao, apid, x, y
        FROM airports
        WHERE y = (SELECT MAX(y)
                   FROM airports AS a, flights AS f
                   WHERE f.src_apid = a.apid OR f.dst_apid = a.apid)

    This works beautifully and reasonably fast as long as y is unique, but fails once it isn't. What I'd want to do instead is find the MAX(y) in the subquery, but return the unique apid for the airport with the highest y. Any suggestions?

    Read the article

  • Utility of List<T>.Sort() versus List<T>.OrderBy() for a member of a custom container class

    - by ccomet
    I've found myself running back through some old 3.5 framework legacy code, and found some points where there are a whole bunch of lists and dictionaries that must be updated in a synchronized fashion. I've determined that I can make this process far easier to both use and understand by consolidating these into custom container classes holding new custom classes. There are some points, however, where I have concerns about organizing the contents of these new container classes by a specific inner property, for example sorting by the ID number property of one class. As the container classes are primarily based around a generic List object, my first instinct was to have the inner classes implement IComparable and write a CompareTo method that compares the properties. That way I can just call items.Sort() when I want to invoke the sorting. However, I've been thinking instead about using items = items.OrderBy(Func). That way it is more flexible if I need to sort by any other property. Readability is better as well, since the property used for sorting appears inline with the sort call rather than requiring a look at the IComparable code. The overall implementation feels cleaner as a result. I don't care for premature or micro optimization, but I like consistency. I find it best to stick with one kind of implementation for as many cases as is appropriate, and use different implementations only where necessary. Is it worth converting my code to use the LINQ OrderBy instead of List.Sort? Is it better practice to stick with the IComparable implementation for these custom containers? Are there any significant mechanical advantages offered by either path that I should weigh the decision on? Or is their end functionality equivalent to the point that it just becomes coder's preference?

    Read the article

  • How to calculate 2^n-1 efficiently without overflow?

    - by Ludwig Weinzierl
    I want to calculate 2^n - 1 for a 64-bit integer value. What I currently do is this:

        for (i = 0; i < n; i++)
            r |= 1ULL << i;

    and I wonder if there is a more elegant way to do it. The line is in an inner loop, so I need it to be fast. I thought of

        r = (1ULL << n) - 1;

    but it doesn't work for n = 64, because << is only defined for shift amounts up to 63.
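
    One way around the n = 64 corner case is to shift in two steps, so no single shift ever reaches the word width. Below is a minimal sketch of that idea in Python, with an explicit 64-bit mask purely to mimic the C behaviour; the helper name and the masking constant are illustrative, not from the question.

        # Build a value with the lowest n bits set, 0 <= n <= 64, without ever
        # shifting by 64 in one step. MASK64 emulates a 64-bit unsigned integer.
        MASK64 = (1 << 64) - 1

        def low_bits_mask(n):
            if n == 0:
                return 0
            # (1 << (n - 1)) << 1 keeps every individual shift at 63 or less,
            # which is the same trick that avoids the out-of-range shift in C.
            return (((1 << (n - 1)) << 1) - 1) & MASK64

        assert low_bits_mask(64) == MASK64
        assert low_bits_mask(3) == 0b111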

    Read the article

  • Label in my table cell

    - by ven in Iphone world
    Hi, this is Lak. Thank you for your fast feedback, but that did not work for me. I am using labels in a table:

        UILabel *label1 = (UILabel *)[cell viewWithTag:1];
        label1.backgroundColor = [UIColor clearColor];
        label1.text = aStation.station_name;
        label1.textColor = [UIColor colorWithRed:0.76 green:0.21 blue:0.07 alpha:1.0];
        [label1 setFont:[UIFont fontWithName:@"Trebuchet MS" size:15]];

    For this type of label I want to limit the number of characters. I hope I will get an answer.

    Read the article

  • Python FastCGI + SQLAlchemy = malformed header from script. Bad header=FROM tags : index.py

    - by crgwbr
    I'm writing a FastCGI application that makes use of SQLAlchemy and MySQL for persistent data storage. I have no problem connecting to the DB and setting up the ORM (so that tables get mapped to classes); I can even add data to tables (in memory). But as soon as I query the DB (and push any changes from memory to storage) I get a 500 Internal Server Error, and my error.log records "malformed header from script. Bad header=FROM tags : index.py", where tags is the table name. Any idea what could be causing this? Also, I don't think it matters, but it's a Linux development server talking to an off-site (across the country) MySQL server.

    Read the article

  • Querying a large text file containing JSON objects

    - by Maciek Sawicki
    Hi, I have a text file of a few gigabytes in this format:

        {"user_ip":"x.x.x.x", "action_type":"xxx", "action_data":{"some_key":"some_value"...},...}

    Each entry is one line. First, I would like to easily find entries for a given IP. This part is easy because I can use grep, for example. But even for this I would like a better solution, because I want the response as fast as possible. The next part is more complicated, because I would like to find entries from a selected IP, of a selected type, and with a particular value of some_key in action_data. I would probably have to convert this file to a SQL database (probably SQLite, because it will be a desktop app), but I'm asking whether better solutions exist.
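
    Since the question already points toward SQLite, here is a rough sketch of that route in Python; the table and column names are assumptions based on the sample line. Load the file once in batches, index the columns you filter on, and the per-IP/per-type lookups become indexed queries instead of full scans.

        import json
        import sqlite3

        def load(path, db_path="actions.db", batch=50000):
            con = sqlite3.connect(db_path)
            con.execute("""CREATE TABLE IF NOT EXISTS actions
                           (user_ip TEXT, action_type TEXT, some_key TEXT)""")
            rows = []
            with open(path) as f:
                for line in f:
                    entry = json.loads(line)
                    rows.append((entry["user_ip"],
                                 entry["action_type"],
                                 entry.get("action_data", {}).get("some_key")))
                    if len(rows) >= batch:        # insert in batches to keep memory flat
                        con.executemany("INSERT INTO actions VALUES (?, ?, ?)", rows)
                        rows = []
            con.executemany("INSERT INTO actions VALUES (?, ?, ?)", rows)
            con.execute("CREATE INDEX IF NOT EXISTS idx_ip_type ON actions(user_ip, action_type)")
            con.commit()
            return con

        # Example lookup after loading:
        # con = load("actions.log")
        # con.execute("SELECT * FROM actions WHERE user_ip=? AND action_type=?",
        #             ("1.2.3.4", "login")).fetchall()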

    Read the article

  • google spreadsheet api from android: google-api-java-client or handmade?

    - by yetanothercoderu
    For working with the Google Spreadsheet API from Android (2.2), Google suggests using google-api-java-client for Android. For that you have to include 5 jars in your Android application: guava-r09.jar, google-http-client-extensions-android2-1.6.0-beta.jar, google-api-client-extensions-android2-1.6.0-beta.jar, google-http-client-1.6.0-beta.jar, and google-api-client-1.6.0-beta.jar, and then dig into the google-api-java-client javadocs for a fast-changing API. Is it worth the effort, in terms of Android specifics and device fragmentation? Isn't it reasonable to write your own simple HTTP response parser, or take a small existing library like google-spreadsheet-lib-android? Thanks! UPDATE: I chose google-api-java-client in the end, as it has all the routine stuff (like parsing HTTP and XML) out of the box.

    Read the article

  • HBase as web app backend

    - by NathanD
    Can anyone advise whether it is a good idea to have HBase as the primary data source for a web-based application? My primary concern is HBase's response time to queries. Is sub-second response possible? Edit: more details about the app itself. Amount of data: ~500 GB of text data, expected to reach 1 TB soon. Number of concurrent users of the app: up to 50. The app will be used to present reports about data stored in HBase, like how many times keyword "X" occurred in the last 24h. For ~80% of requests from the app I will know the exact key; 20% will be scans (I'm looking into HBase schema design topics to make those run fast).

    Read the article

  • Validate constructor arguments or method parameters with annotations, and let them throw an exception

    - by marius
    I am validating constructor and method arguments, as I want the software, especially the model part of it, to fail fast. As a result, constructor code often looks like this:

        public MyModelClass(String arg1, String arg2, OtherModelClass otherModelInstance) {
            if (arg1 == null) {
                throw new IllegalArgumentException("arg1 must not be null");
            }
            // further validation of constraints...
            // actual constructor code...
        }

    Is there a way to do that with an annotation-driven approach? Something like:

        public MyModelClass(@NotNull(raise = IllegalArgumentException.class, message = "arg1 must not be null") String arg1,
                            @NotNull(raise = IllegalArgumentException.class) String arg2,
                            OtherModelClass otherModelInstance) {
            // actual constructor code...
        }

    In my eyes this would make the actual code a lot more readable. I understand that there are annotations to support IDE validation (like the existing @NotNull annotation). Thank you very much for your help.

    Read the article

  • What's a better choice for SQL-backed number crunching - Ruby 1.9, Python 2, Python 3, or PHP 5.3?

    - by Ivan
    Criteria for 'better': fast at math and at simple (few fields, many records) DB transactions; convenient to develop, read, and extend; flexible; easy to connect to other systems. The task is to use a common web development scripting language to process and calculate long time series and multidimensional surfaces (mostly selecting/inserting sets of floats and doing math with them). The choices are Ruby 1.9, Python 2, Python 3, PHP 5.3, Perl 5.12, and JavaScript (node.js). All the data is to be stored in a relational database (due to its heavily multidimensional nature), and all communication with the outside world is to be done by means of web services.

    Read the article

  • Keep the Windows console open after a Python syntax error

    - by basweber
    File associations on my machine (WinXP Home) are such that a Python script is opened directly with the Python interpreter. If I double-click a Python script, a console window runs and everything is fine, as long as there is no syntax error in the script. In that case the console window opens for a moment but is closed immediately, too fast to read the error message. Of course there is the possibility of manually opening a console window and executing the script by typing python myscript.py, but I am sure there is a more convenient (i.e. "double click based") solution.
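
    One double-click-friendly approach is a small wrapper that runs the target script and pauses before the window closes; associating .py files with the wrapper (or with a .bat file that ends in pause) keeps the traceback on screen. A minimal sketch, assuming Python 3; the file name keep_open.py and the argument handling are illustrative, not from the question.

        # keep_open.py - run the given script and hold the console open afterwards,
        # even if the script fails to compile (syntax error) or raises at runtime.
        import runpy
        import sys
        import traceback

        if __name__ == "__main__":
            target = sys.argv[1]                      # e.g. keep_open.py myscript.py
            try:
                runpy.run_path(target, run_name="__main__")
            except BaseException:
                traceback.print_exc()                 # a SyntaxError in the target lands here
            finally:
                input("Press Enter to close this window...")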

    Read the article

  • Can this rectangle to rectangle intersection code still work?

    - by Jeremy Rudd
    I was looking for fast code to test whether two rectangles intersect. A search on the internet came up with this one-liner (WOOT!), but I don't understand how to write it in JavaScript; it seems to be written in an ancient form of C++. Can this thing still work? Can you make it work?

        typedef struct {
            LONG left;
            LONG top;
            LONG right;
            LONG bottom;
        } RECT;

        bool IntersectRect(const RECT *r1, const RECT *r2)
        {
            return !(r2->left   > r1->right  ||
                     r2->right  < r1->left   ||
                     r2->top    > r1->bottom ||
                     r2->bottom < r1->top);
        }

    Read the article

  • Animate an image after hovering over an item for a second

    - by mikep
    Hey, after several tries to get this to work, I'm asking if you know where my mistake is. This is my code so far:

        $(".menu a").hover(
            function () {
                $(this).data('timeout', setTimeout(function () {
                    $(this).hover(function () {
                        $(this).next("em").animate({opacity: "show", top: "-65"}, "slow");
                    }, function () {
                        $(this).next("em").animate({opacity: "hide", top: "-75"}, "fast");
                    });
                }, 1000));
            },
            function () {
                clearTimeout($(this).data('timeout'));
            });

    I would be happy about some help.

    Read the article

  • PHP APC - Why is loading cached array op codes slow?

    - by Aaron Kreider
    I'm using APC to reduce loading time for my PHP files. My files load very fast, except for one file where I define more than 100 arrays. This 270 KB file takes 200 ms to load. The rest of the files are full of objects, methods, and functions. I'm wondering: does opcode caching not work as well for arrays? My APC cache should be big enough to handle all of my classes. Currently 40% of my cache is free, and my hit rate is 99%.

        apc.shm_size = 32M
        apc.max_file_size = 1M
        apc.shm_segments = 1

    APC 3.1.6. I'm using PHP 5.2, Apache 2, and Windows Vista.

    Read the article

  • Get only new RSS entries with a PHP script?

    - by ArneRie
    What I'm trying to do: fetch X RSS feeds from my blogs and echo only new entries. My problem is how to know which items have already been parsed. My solution so far: fetch each feed every 5 hours and store all titles in a database table or flat file. On the next run, check whether the title is already in the database; if not, print it and save it to the database. But I'm not sure this is best practice. If someone knows a fast way, it would be great. Sorry for my poor English.
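
    The store-and-compare idea works; the usual refinement is to key on each entry's GUID (or link) rather than the title, since titles can repeat or be edited. Below is a rough sketch of the flow, written in Python only to illustrate it; the same structure translates to PHP (for example with SimplePie and PDO/SQLite), and the table and function names are assumptions.

        import sqlite3
        import feedparser  # any feed parser that exposes entry ids/links works the same way

        def print_new_entries(feed_url, db_path="seen.db"):
            con = sqlite3.connect(db_path)
            con.execute("CREATE TABLE IF NOT EXISTS seen (guid TEXT PRIMARY KEY)")
            for entry in feedparser.parse(feed_url).entries:
                guid = entry.get("id") or entry.get("link")
                if not guid:
                    continue                      # nothing stable to key on; skip this entry
                already = con.execute("SELECT 1 FROM seen WHERE guid = ?", (guid,)).fetchone()
                if already is None:
                    print(entry.get("title", ""))             # a new entry: echo it...
                    con.execute("INSERT INTO seen VALUES (?)", (guid,))  # ...and remember it
            con.commit()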

    Read the article

  • What should be taught in a "Fundamentals of programming" course at university?

    - by Dervin Thunk
    I have started a new question (see here), because I think the topic is of importance in a more general form. The question is now: if you were a professor in a computer science department at some university, what would make it into your course? This is a programming course, second term, first year computer science/computer engineering. Remember you have a limited amount of time, and students are at different levels of competence; some may become scientists, but some will also go on to be programmers in companies of various kinds. You have to cater to all. Bonus: what language? (Although see this question for my current thoughts about that...) Maybe you want to attach a course outline from some university? See here for an even more general question about this. Answer: I can't really summarize this post... I guess it was too subjective. However, it looks like we have to cover the history of computing up to a certain extent, computer architecture (memory, registers, and so on), C, and finally some basic algorithms and data structures in a problem-solving fashion. That will be the bare bones of the course. Thanks all. I will accept the most-voted-up answer to close the thread, as should be done.

    Read the article

  • Optimizing MySQL statement with lots of count(row) and sum(row+row2)...

    - by Zombies
    I need to use the InnoDB storage engine on a table with about 1 million records in it at any given time. Records are inserted at a very fast rate and then dropped within a few days, maybe a week. The ping table has about a million rows, whereas the website table has only about 10,000. My statement is this:

        SELECT url
        FROM website ws, ping pi
        WHERE ws.idproxy = pi.idproxy
          AND pi.entrytime > CURDATE() - 3
          AND contentping + tcpping IS NOT NULL
        GROUP BY url
        HAVING SUM(contentping + tcpping) / (COUNT(*) - COUNT(errortype)) < 500
           AND COUNT(*) > 3
           AND COUNT(errortype) / COUNT(*) < .15
        ORDER BY SUM(contentping + tcpping) / (COUNT(*) - COUNT(errortype)) ASC;

    I added an index on entrytime, yet no dice. Can anyone throw me a bone as to what I should look into for basic optimization of this query? The result set is only around 200 rows, so I'm not getting killed there.

    Read the article

  • Can this loop be sped up in pure Python?

    - by Noctis Skytower
    I was trying out an experiment with Python, trying to find out how many times it could add one to an integer in one minute. Assuming two computers are identical except for the speed of the CPU, this should give an estimate of how fast some CPU operations may be on the computer in question. The code below is an example of a test designed to fulfill those requirements. This version is about 20% faster than the first attempt and 150% faster than the third attempt. Can anyone make any suggestions as to how to get the most additions in a minute's span? Higher numbers are desirable. EDIT: This experiment is written in Python 3.1 and is 15% faster than the fourth speed-up attempt.

        def start(seconds):
            import time, _thread

            def stop(seconds, signal):
                time.sleep(seconds)
                signal.pop()

            total, signal = 0, [None]
            _thread.start_new_thread(stop, (seconds, signal))
            while signal:
                total += 1
            return total

        if __name__ == '__main__':
            print('Testing the CPU speed ...')
            print('Relative speed:', start(60))
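
    One direction worth measuring (a sketch, not a guaranteed win): do the additions in fixed-size chunks so the truthiness test on the signal list is paid once per chunk instead of once per addition. The count then overshoots by at most chunk - 1 and the stop is detected slightly late, which is usually acceptable for a relative benchmark. The function name and chunk size below are illustrative.

        import time, _thread

        def start_chunked(seconds, chunk=10000):
            def stop(seconds, signal):
                time.sleep(seconds)
                signal.pop()

            total, signal = 0, [None]
            _thread.start_new_thread(stop, (seconds, signal))
            while signal:
                for _ in range(chunk):   # tight inner loop; signal is checked once per chunk
                    total += 1
            return total

        if __name__ == '__main__':
            print('Relative speed (chunked):', start_chunked(60))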

    Read the article

  • Loading a DB table into nested dictionaries in Python

    - by Hossein
    Hi, I have a table in a MySQL DB which I want to load into a dictionary in Python. The table columns are: id, url, tag, tagCount. tagCount is the number of times a tag has been repeated for a certain URL, so I need a nested dictionary, in other words a dictionary of dictionaries, to load this table, because each URL has several tags, each with a different tagCount. The code that I used is this (the whole table is about 22,000 records):

        cursor.execute('''SELECT url, tag, tagCount FROM wtp''')
        urlTagCount = cursor.fetchall()
        d = defaultdict(defaultdict)
        for url, tag, tagCount in urlTagCount:
            d[url][tag] = tagCount
        print d

    First of all I want to know if this is correct, and if it is, why does it take so much time? Are there any faster solutions? I am loading this table into memory to get fast access and avoid slow database operations, but with this slow speed it has become a bottleneck itself; it is even much slower than DB access. Can anyone help? Thanks.
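
    For scale, 22,000 rows should normally load in well under a second, so it is worth timing the query/fetch and the loop separately before blaming either. A small variation to try (a sketch, not a diagnosis): stream the rows in batches with the standard DB-API fetchmany() and build plain dicts, which avoids materialising the whole result set at once. The helper name and batch size are assumptions.

        def load_tag_counts(cursor, batch=5000):
            """Build {url: {tag: tagCount}} without fetching the whole result set at once."""
            cursor.execute("SELECT url, tag, tagCount FROM wtp")
            d = {}
            while True:
                rows = cursor.fetchmany(batch)
                if not rows:
                    break
                for url, tag, tag_count in rows:
                    d.setdefault(url, {})[tag] = tag_count
            return d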

    Read the article

  • Migrate from VB.net to C#

    - by rowmark
    Hello experts, I have been developing applications using VB.NET for the past 5 years. I tried to learn Java earlier and found it very difficult, so I stuck with VB.NET. To me, C# is more or less similar to Java. Now I cannot get away with that any longer: I have to code in C#. Is there a way I can get up to speed with C# fast? I would really appreciate it if you could let me know your thoughts and whether there are any good resources I can try.

    Read the article

  • Is there a generic way of dealing with varying connection strings in C#?

    - by James Wiseman
    I have an application that needs to connect to a SQL database and execute a SQL Agent job. The connection string I am trying to access is stored in the registry, and is easily enough pulled out. This application is to be run on multiple computers, and I cannot guarantee that the format of the connection string is consistent across them. Two that I have pulled out, for example, are:

        Data Source=Server1;Initial Catalog=DB1;Integrated Security=SSPI;
        Data Source=Server2;Initial Catalog=DB1;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;

    I can use an object of type System.Data.SqlClient.SqlConnection to connect to the database with the first connection string; however, I get the following error when I pass the second to it: "keyword not supported: 'provider'". Similarly, I can use an object of type System.Data.OleDb.OleDbConnection to connect to the database with the second connection string; however, I get the following error when I pass the first to it: "An OLEDB Provider was not specified in the ConnectionString". I can solve this by scanning the string for 'Provider' and connecting conditionally, but I can't help feeling there is a better way of doing this that handles the connection strings in a more generic fashion. Does anyone have any suggestions?

    Read the article
