Search Results

Search found 3265 results on 131 pages for 'parallel coordinates'.

Page 80/131

  • How do SSD drives reduce their latency?

    - by tigrou
    The first time I read about SSDs, I was surprised to learn that they internally use NAND flash chips. This kind of memory generally has low bandwidth and high latency, while SSDs are just the opposite. I understand the bandwidth part: SSD drives increase their bandwidth by using several NAND flash chips in parallel. In other words, the controller does data striping (as in RAID 0) across several chips. What I don't understand is how SSD drives manage to reduce latency (or at least do a lot better than a single NAND chip without a controller could).
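
    For illustration only, here is a minimal Python sketch of the striping idea described above; the channel count and the mapping are assumptions, not how any particular controller works. The point is that consecutive logical pages land on different chips, so several outstanding reads overlap in time and the latency seen per request drops even though each individual chip is no faster.

        # Hypothetical page interleaving across independent NAND channels.
        NUM_CHANNELS = 8                      # assumed number of channels

        def map_logical_page(lpn):
            """Map a logical page number to (channel, page within channel)."""
            return lpn % NUM_CHANNELS, lpn // NUM_CHANNELS

        # Sixteen consecutive logical pages spread over eight channels: two
        # reads per channel instead of sixteen serialized reads on one chip.
        for lpn in range(16):
            channel, page = map_logical_page(lpn)
            print(f"logical page {lpn:2d} -> channel {channel}, page {page}")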

    Read the article

  • Home router for higher network loads

    - by Zizzencs
    I'm trying to find a replacement for my ASUS WL500GP router. Some constraints:
    - my ISP now provides 120 Mbit bandwidth - the router must be able to handle it
    - I have 3 computers and a laptop at home - the router must have at least 4 LAN ports and the WAN port
    - I need to be able to do the following in parallel: use bittorrent with ~1000 connections; transfer files from one home computer to another at high speed; access one home computer from my workplace
    - I don't really need wireless
    - I don't really want two boxes to serve all these purposes
    So I was searching for a router with 1 Gbit ports, jumbo frame support and good reviews, but haven't really found a promising candidate so far. Any suggestions?

    Read the article

  • Clarification of the difference between PCI memory addressing and I/O addressing?

    - by KevinM
    Could someone please clarify the difference between memory and I/O addresses on the PCI/PCIe bus? I understand that I/O addresses are 32-bit, limited to the range 0 to 4 GB, and do not map onto system memory (RAM), and that memory addresses are either 32-bit or 64-bit. I get the impression that memory addressing must map onto available RAM; is this true? That is, if a PCI device wishes to transfer data to a memory address, must that address exist in actual system RAM (allocated during PCI configuration) rather than in virtual memory? So if a PCI device only needs to transfer a small amount of data at a time, where there is no advantage to putting it into RAM or using DMA, then I/O addressing is fine (e.g. a parallel port implemented on a PCI card). And why do I keep reading that PCI/PCIe I/O addressing is being deprecated in favour of memory addressing? Thanks!
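
    As a concrete illustration of how the two address spaces are told apart, here is a short Python sketch that decodes a PCI Base Address Register (BAR): bit 0 marks an I/O BAR, and for memory BARs bits 2:1 give the width and bit 3 the prefetchable flag. The sample BAR values are made up for the example.

        def decode_bar(bar):
            """Classify a raw 32-bit PCI Base Address Register value."""
            if bar & 0x1:                        # bit 0 set -> I/O space
                return "I/O space, base 0x%08X" % (bar & ~0x3)
            width = "64-bit" if (bar >> 1) & 0x3 == 0x2 else "32-bit"
            prefetch = "prefetchable" if bar & 0x8 else "non-prefetchable"
            return "memory space (%s, %s), base 0x%08X" % (width, prefetch, bar & ~0xF)

        # Made-up example BAR values
        for bar in (0x0000E801, 0xFEBF0000, 0xF000000C):
            print("0x%08X -> %s" % (bar, decode_bar(bar)))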

    Read the article

  • Windows 7 doesn't recognize DVD drive

    - by adrianboimvaser
    I reinstalled Windows 7 (x64) because the DVD drive stopped being recognized. At first it worked; I even got to burn a DVD. Some time later it mysteriously disappeared again. This question has been asked and answered many times on the Internet, but no proposed solution works for me. Please help! Any clues are gratefully accepted! EDIT: I booted a live CD and successfully burned a DVD there. When I rebooted back into Windows 7 the DVD drive was there. I burnt another DVD and it also works... I don't know how long that's going to last. The drive is a TSSTcorp CDDVDW TS-L632H, firmware revision HS02. It's parallel ATA.

    Read the article

  • Should I store my code/projects on my SSD or my secondary drive?

    - by user37467
    I just got a new box. It has an SSD for the primary drive and a 1 TB SATA drive for the secondary. I'm going to run Windows and my binaries on the SSD and keep all my downloads/documents/music/etc. on the secondary drive. My question is: should I also keep my Visual Studio projects and code on the SSD, or keep them on the secondary drive? The faster SSD would presumably be better for compiling and indexed searches, but would it be better to keep them on the second drive for more parallel disk I/O?

    Read the article

  • Is there any Google Maps API function to concatenate an address string from the AddressDetails structure?

    - by Vadim
    Hello! I am using Google Maps' GClientGeocoder to reverse-geocode map coordinates into an address string, exactly as shown in Google's example here: http://code.google.com/apis/ajax/playground/?exp=maps#geocoding_reverse. However, I would like to strip LocalityName (place.AddressDetails.Country.AdministrativeArea.Locality.LocalityName) out of place.address. The straightforward way would be to join all the AddressDetails elements except LocalityName, but the order of the elements in the final string depends on the geographical location. For example, the order for an Australian city is ThoroughfareName + ", " + LocalityName + " " + AdministrativeAreaName + " " + PostalCodeNumber + ", " + CountryName, while the order for a Russian city is CountryName + ", " + PostalCodeNumber + ", " + LocalityName + ", " + ThoroughfareName. Moreover, PostalCodeNumber was not even supplied in AddressDetails in the second example. Please help!
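
    One way to frame the problem, sketched in Python rather than the Maps API's JavaScript: keep a per-country ordering of the AddressDetails fields and skip anything that is missing. The country codes and orderings below are made-up examples, not an exhaustive or official list.

        # Hypothetical per-country field orderings (LocalityName deliberately omitted).
        FIELD_ORDER = {
            "AU": ["ThoroughfareName", "AdministrativeAreaName", "PostalCodeNumber", "CountryName"],
            "RU": ["CountryName", "PostalCodeNumber", "ThoroughfareName"],
        }

        def format_address(fields, country_code):
            """Join the available fields in the order used by the given country,
            skipping LocalityName and anything that was not supplied."""
            default = sorted(k for k in fields if k != "LocalityName")
            order = FIELD_ORDER.get(country_code, default)
            return ", ".join(fields[name] for name in order if fields.get(name))

        # Made-up example: no PostalCodeNumber supplied, as in the question.
        example = {"CountryName": "Russia", "LocalityName": "Moscow",
                   "ThoroughfareName": "Tverskaya ulitsa"}
        print(format_address(example, "RU"))   # -> "Russia, Tverskaya ulitsa"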

    Read the article

  • How to fix basicHttpBinding in WCF when using multiple proxy clients?

    - by Hemant
    [Question seems a little long, but please have patience. It has sample source to explain the problem.] Consider the following code, which is essentially a WCF host:

        [ServiceContract (Namespace = "http://www.mightycalc.com")]
        interface ICalculator
        {
            [OperationContract]
            int Add (int aNum1, int aNum2);
        }

        [ServiceBehavior (InstanceContextMode = InstanceContextMode.PerCall)]
        class Calculator: ICalculator
        {
            public int Add (int aNum1, int aNum2)
            {
                Thread.Sleep (2000); // Simulate a lengthy operation
                return aNum1 + aNum2;
            }
        }

        class Program
        {
            static void Main (string[] args)
            {
                try
                {
                    using (var serviceHost = new ServiceHost (typeof (Calculator)))
                    {
                        var httpBinding = new BasicHttpBinding (BasicHttpSecurityMode.None);
                        serviceHost.AddServiceEndpoint (typeof (ICalculator), httpBinding, "http://172.16.9.191:2221/calc");
                        serviceHost.Open ();

                        Console.WriteLine ("Service is running. ENJOY!!!");
                        Console.WriteLine ("Type 'stop' and hit enter to stop the service.");
                        Console.ReadLine ();

                        if (serviceHost.State == CommunicationState.Opened)
                            serviceHost.Close ();
                    }
                }
                catch (Exception e)
                {
                    Console.WriteLine (e);
                    Console.ReadLine ();
                }
            }
        }

    The WCF client program is:

        class Program
        {
            static int COUNT = 0;
            static Timer timer = null;

            static void Main (string[] args)
            {
                var threads = new Thread[10];
                for (int i = 0; i < threads.Length; i++)
                {
                    threads[i] = new Thread (Calculate);
                    threads[i].Start (null);
                }

                timer = new Timer (o => Console.WriteLine ("Count: {0}", COUNT), null, 1000, 1000);

                Console.ReadLine ();
                timer.Dispose ();
            }

            static void Calculate (object state)
            {
                var c = new CalculatorClient ("BasicHttpBinding_ICalculator");
                c.Open ();

                while (true)
                {
                    try
                    {
                        var sum = c.Add (2, 3);
                        Interlocked.Increment (ref COUNT);
                    }
                    catch (Exception ex)
                    {
                        Console.WriteLine ("Error on thread {0}: {1}", Thread.CurrentThread.Name, ex.GetType ());
                        break;
                    }
                }

                c.Close ();
            }
        }

    Basically, I am creating 10 proxy clients and then repeatedly calling the Add service method on separate threads. If I run both applications and observe the open TCP connections using netstat, I find that:

    1. If client and server run on the same machine, the number of TCP connections equals the number of proxy objects, which means all requests are served in parallel. Good.
    2. If I run the server on a separate machine, at most 2 TCP connections are opened regardless of how many proxy objects I create. Only 2 requests run in parallel, which hurts processing speed badly.
    3. If I switch to the net.tcp binding, everything works fine: a separate TCP connection per proxy object even across machines.

    I am very confused and unable to make basicHttpBinding use more TCP connections. I know it is a long question, but please help!

    Read the article

  • NoSQL vs Ehcache caching advice for speeding up a read-only MySQL database

    - by paddydub
    I'm building a route planner webapp using Spring/Hibernate/Tomcat and a MySQL database. The database contains read-only data, such as bus stop coordinates and bus times, which is never updated. I'm trying to make the app run faster: each time the application runs it performs approximately 1000 reads against the database to calculate a route. I have set up Ehcache, which greatly improves the read-from-database times, and I'm now setting up Terracotta + Ehcache distributed caching to share the cache between multiple Tomcat JVMs, which seems a bit complicated. I've tried memcached but it was not performing as fast as Ehcache. I'm wondering if MongoDB or Redis would be better suited. I have no experience with NoSQL, but I would appreciate it if anyone has any ideas. What I need is quick access to the read-only database.

    Read the article

  • How to draw a UIImageView in -drawRect:?

    - by mystify
    I've been trying for 5 hours now:

        - (void)drawRect:(CGRect)rect {
            // flip the wrong coordinate system
            CGContextTranslateCTM(context, 0.0f, rect.size.height); // shift the origin up
            CGContextScaleCTM(context, 1.0f, -1.0f);                // flip the y-axis
            CGContextDrawImage(context, myImageView.frame, myImageView.image.CGImage);
        }

    The problem: while the image draws correctly, the coordinates specified by the UIImageView frame are completely useless; the image appears in completely the wrong place on screen. I guess I must also flip the CGRect of the UIImageView? But how?

    Read the article

  • C# ListView Detail, Highlight a single cell

    - by Mike Christiansen
    Hello, I'm using a ListView in C# to make a grid. I would like a way to highlight a specific cell programmatically; I only need to highlight one cell. I've experimented with owner-drawn subitems, but with the code below I get highlighted cells but no text! Any ideas on how to get this working? Thanks for your help.

        // m_PC.Location is the X,Y coordinate of the highlighted cell.
        void listView1_DrawSubItem(object sender, DrawListViewSubItemEventArgs e)
        {
            if ((e.ItemIndex == m_PC.Location.Y) && (e.Item.SubItems.IndexOf(e.SubItem) == m_PC.Location.X))
                e.SubItem.BackColor = Color.Blue;
            else
                e.SubItem.BackColor = Color.White;

            e.DrawBackground();
            e.DrawText();
        }

    Read the article

  • Transparent View with Android

    - by victorusmo
    Hi everybody, I'm trying to have a bitmap moving over my Android application. I am able to draw my bitmap behind my text views, but not over them.

        public void onCreate(Bundle savedInstanceState)
        ...
        // ll is a FrameLayout
        ll.addView(text1);
        ll.addView(text2);
        ll.addView(new Panel(this), 200, 400);

    My Panel class is defined like this:

        class Panel extends SurfaceView
        ......
        @Override
        public void onDraw(Canvas canvas) {
            canvas.drawColor(0, PorterDuff.Mode.CLEAR);
            Bitmap bitmap;
            GraphicObject.Coordinates coords;
            for (GraphicObject graphic : _graphics) {
                bitmap = graphic.getGraphic();
                coords = graphic.getCoordinates();
                canvas.drawBitmap(bitmap, coords.getX(), coords.getY(), null);
            }
        }

    Can you help me? How can I draw a transparent bitmap over the views of my application? Thanks a lot, cheers, Victor

    Read the article

  • How do I change the frame position for a custom MKAnnotationView?

    - by andrei
    I am trying to make a custom annotation view by subclassing MKAnnotationView and overriding the drawRect method. I want the view to be drawn offset from the annotation's position, somewhat like MKPinAnnotationView does it, so that the point of the pin is at the specified coordinates, rather than the middle of the pin. So I set the frame position and size as shown below. However, it doesn't look like I can affect the position of the frame at all, only the size. The image ends up being drawn centered over the annotation position. Any tips on how to achieve what I want?

    MyAnnotationView.h:

        @interface MyAnnotationView : MKAnnotationView {
        }

    MyAnnotationView.m:

        - (id)initWithAnnotation:(id <MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier {
            if (self = [super initWithAnnotation:annotation reuseIdentifier:reuseIdentifier]) {
                self.canShowCallout = YES;
                self.backgroundColor = [UIColor clearColor];
                // Position the frame so that the bottom of it touches the annotation position
                self.frame = CGRectMake(0, -16, 32, 32);
            }
            return self;
        }

        - (void)drawRect:(CGRect)rect {
            [[UIImage imageNamed:@"blue-dot.png"] drawInRect:rect];
        }

    Read the article

  • How to geocode a large number of addresses?

    - by user308569
    I need to geocode, i.e. translate street addresses to latitude/longitude, for ~8,000 addresses. I am using both the Yahoo and Google geocoding engines at http://www.gpsvisualizer.com/geocoder/, and found that for a large number of addresses those engines (one of them or both) either could not geocode the address at all (i.e. return latitude=0, longitude=0) or return the wrong coordinates (including cases where Yahoo and Google give different results). What is the best way to handle this problem? Which engine is usually more accurate? I would appreciate any thoughts, suggestions, or ideas from people who have previous experience with this kind of task.
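
    At this scale the bookkeeping usually matters more than the choice of engine, so here is a hedged Python sketch of one way to batch the work: query two providers, rate-limit the requests, and set aside anything that comes back empty or where the providers disagree so it can be reviewed by hand. The geocoder_a and geocoder_b callables are placeholders for whatever services you end up using.

        import time

        def geocode_batch(addresses, geocoder_a, geocoder_b, delay=0.2):
            """Geocode a list of addresses with two providers and flag suspect results.

            geocoder_a / geocoder_b take an address string and return (lat, lon);
            both are hypothetical stand-ins for whichever services you use."""
            failed = lambda r: r is None or r == (0.0, 0.0)   # engines return 0,0 on failure
            good, suspect = {}, {}
            for addr in addresses:
                a, b = geocoder_a(addr), geocoder_b(addr)
                if failed(a) and failed(b):
                    suspect[addr] = ("no result", a, b)
                elif not failed(a) and not failed(b) and \
                        (abs(a[0] - b[0]) > 0.01 or abs(a[1] - b[1]) > 0.01):
                    suspect[addr] = ("providers disagree", a, b)   # roughly 1 km apart
                else:
                    good[addr] = b if failed(a) else a
                time.sleep(delay)      # stay under the providers' rate limits
            return good, suspect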

    Read the article

  • iPhone hitTest broken after rotation

    - by Adam
    Hi all, I have a UIView that contains a number of CALayer subclasses. I am using the following code to detect which layer a touch event corresponds to:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint touchPoint = [touch locationInView:self];
            NSLog(@"%@,%@", NSStringFromCGPoint(touchPoint), [self.layer hitTest:touchPoint].name);
        }

    This works fine until the device is rotated. When the device is rotated, all current layers are removed from the superlayer and new CALayers are created to fit the new orientation. The new layers are correctly inserted and viewable in the correct orientation. After the rotation, however, hitTest consistently returns nil when I am clearly clicking on the newly created layers, and it registers layers at locations that are incorrect. The coordinates of the hit test are correct, but no layers are found. Am I missing a function call or something after handling the rotation? Cheers, Adam

    Read the article

  • MongoDB vs Ehcache caching advice for speeding up a read-only MySQL database

    - by paddydub
    I'm building a route planner webapp using Spring/Hibernate/Tomcat and a MySQL database. The database contains read-only data, such as bus stop coordinates and bus times, which is never updated. I'm trying to make the app run faster: each time the application runs it performs approximately 1000 reads against the database to calculate a route. I have set up Ehcache, which greatly improves the read-from-database times, and I'm now setting up Terracotta + Ehcache distributed caching to share the cache between multiple Tomcat JVMs, which seems a bit complicated. I've tried memcached but it was not performing as fast as Ehcache. I'm wondering if MongoDB would be better suited. I have no experience with NoSQL, but I would appreciate it if anyone has any ideas. All I need is quick access to the read-only database.

    Read the article

  • Reverse geocode without using MKReverseGeocoder

    - by SpH1nX
    Hi guys, I'm trying to detect the current user's address using MKReverseGeocoder, passing it coordinates obtained via the CLLocation class. Reading the MKReverseGeocoder class reference I noticed that "The Google terms of service require that the reverse geocoding service be used in conjunction with a Google map; take this into account when designing your application's user interface." So I'm wondering if (and if so, how) I can reverse-geocode the user's current location on iPhone OS SDK 3.1.3. I thought about using the Google Maps API, but its EULA has the same obligation. The Yahoo Maps API is even worse, and Microsoft's isn't free.

    Read the article

  • Compute bounding quad of a sphere with vertex shader

    - by Ben Jones
    I'm trying to implement an algorithm from a graphics paper and part of the algorithm is rendering spheres of known radius to a buffer. They say that they render the spheres by computing the location and size in a vertex shader and then doing appropriate shading in a fragment shader. Any guesses as to how they actually did this? The position and radius are known in world coordinates and the projection is perspective. Does that mean that the sphere will be projected as a circle? Thanks!
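
    Not from the paper the question refers to, but as a rough sketch of the geometry involved: under perspective projection a sphere does not in general project to a circle (its silhouette is an ellipse), so one common approach is to compute a conservative screen-aligned quad that is guaranteed to cover it. The Python below does the arithmetic a vertex shader would do; the parameter names and the symmetric pinhole camera (looking down -z) are my assumptions.

        import math

        def sphere_screen_quad(center_view, radius, fovy_deg, viewport_w, viewport_h):
            """Conservative pixel-space bounding quad of a sphere given in eye space.

            Requires the sphere to lie entirely in front of the eye (depth > radius);
            a real implementation also needs to handle near-plane clipping."""
            x, y, z = center_view
            depth = -z                                       # distance along the view axis
            f = viewport_h / (2.0 * math.tan(math.radians(fovy_deg) / 2.0))  # focal length, px
            cx = viewport_w / 2.0 + f * x / depth            # projection of the center
            cy = viewport_h / 2.0 + f * y / depth
            # For any point within `radius` of the center, the projected offset is
            # bounded by f*r*(depth + |x|) / ((depth - r) * depth) along each axis,
            # so this quad over-covers the elliptical silhouette.
            half_x = f * radius * (depth + abs(x)) / ((depth - radius) * depth)
            half_y = f * radius * (depth + abs(y)) / ((depth - radius) * depth)
            return cx - half_x, cy - half_y, cx + half_x, cy + half_y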

    Read the article

  • Correct handling of OnMouseWheel events in Ext-GWT

    - by Kevin Loney
    I'm trying to figure out which property of BoxComponentEvent will tell me if the generated OnMouseWheel event was a scroll-up or scroll-down event. I have output the values of all the properties BoxComponentEvent exposes, and all of them (with the exception of the coordinates at which the event took place) stay the same regardless. Google and the Ext-GWT docs have been pretty useless for providing a concrete example.

        public class MyPanel extends ContentPanel {
            // ...
            public MyPanel() {
                addListener(Events.OnMouseWheel, new Listener<BoxComponentEvent>() {
                    public void handleEvent(BoxComponentEvent be) {
                        // What happens here to distinguish scroll-up and scroll-down?
                    }
                });
            }

            protected void afterRender() {
                super.afterRender();
                el().addEventsSunk(Events.OnMouseWheel.getEventCode());
            }
            // ...
        }

    Read the article

  • Best practice to display POI in iPhone's MapKit?

    - by iamj4de
    Assume I have a database of POIs with their respective coordinates (longitude and latitude). What would be the "standard" way to display the POIs as annotations around the user's current location? To elaborate: given a zoom level, I guess I have to search the database for all POIs whose distance to the current location is below a certain threshold, then create annotations for them. Or is there a smarter way? If the user zooms in/out or moves the map, do I need to redo the whole thing again? It seems that MapKit has a mechanism to cache/reuse annotations. Should I create a lot of them right away and let MapKit decide what to render when the visible region changes? I guess that would make the transitions smoother, but it also consumes more memory. What is your experience with this? Thanks.
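
    A sketch of the "query by visible region" idea described above, independent of MapKit: turn the map's visible region (center plus span) into a latitude/longitude bounding box and let an indexed database query do the filtering, re-running it whenever the region changes. The table and column names are made up, and longitude wrap-around at +/-180 is ignored for brevity.

        import sqlite3

        def pois_in_region(conn, center_lat, center_lon, lat_span, lon_span):
            """Return POIs inside a lat/lon bounding box (MapKit-style center + span).

            Assumes a hypothetical table poi(name TEXT, lat REAL, lon REAL)
            with an index on (lat, lon)."""
            min_lat, max_lat = center_lat - lat_span / 2, center_lat + lat_span / 2
            min_lon, max_lon = center_lon - lon_span / 2, center_lon + lon_span / 2
            rows = conn.execute(
                "SELECT name, lat, lon FROM poi "
                "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
                (min_lat, max_lat, min_lon, max_lon))
            return rows.fetchall()

        # Re-run when the visible region changes, then add/remove annotations for
        # the difference between the new result set and what is already on the map.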

    Read the article

  • OpenNETCF Signature control question

    - by Vaccano
    I am using the Signature control in OpenNETCF. It works great for almost everything I need. However, I need a way to invert the signature and load it back in. The control has a call to get the "bytes" of the signature (GetSignatureEx()), which returns a byte[]; this can then be loaded back in with LoadSignatureEx(). I can't seem to figure out the format of these bytes. I thought they might be coordinates, but it does not seem so now. If anyone out there knows a way to invert the signature and load it back in, I would be grateful to hear it.

    Read the article

  • OpenGL ES 2.0 equivalent of glOrthof?

    - by Zippo
    Hi guys, in my iPhone app I need to project the 3D scene onto the 2D coordinates of the screen for some calculations. My objects go through various rotations, translations and scalings, so I figure I need to multiply the vertices by the modelview matrix first, and then by the orthographic projection matrix. First of all, am I on the right track? I have the modelview matrix, but I need the projection matrix: is there a glOrthof equivalent in ES 2.0? PS: I am new to OpenGL. Thanks for your help. Zippo
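
    For context: the fixed-function matrix stack is gone in ES 2.0, so the usual approach is to build the orthographic matrix yourself and upload it as a uniform with glUniformMatrix4fv. Below is the matrix glOrthof used to build, written out in Python/NumPy so the values are easy to check; in the app you would compute the same numbers in your own math code (remember GL expects column-major storage when uploading).

        import numpy as np

        def ortho(left, right, bottom, top, near, far):
            """The orthographic projection matrix glOrthof(l, r, b, t, n, f) builds."""
            return np.array([
                [2.0 / (right - left), 0.0, 0.0, -(right + left) / (right - left)],
                [0.0, 2.0 / (top - bottom), 0.0, -(top + bottom) / (top - bottom)],
                [0.0, 0.0, -2.0 / (far - near), -(far + near) / (far - near)],
                [0.0, 0.0, 0.0, 1.0],
            ], dtype=np.float32)

        # Projecting a point: clip = P @ MV @ [x, y, z, 1]. With an orthographic P
        # the w component stays 1, so clip x/y are already normalized device coords.
        P = ortho(0, 320, 0, 480, -1, 1)
        print(P @ np.array([160, 240, 0, 1], dtype=np.float32))   # -> [0. 0. 0. 1.]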

    Read the article

  • How do I translate movement on the Canvas3D to movement in the virtual 3D world?

    - by Coder
    My goal is to move a shape in the virtual world in such a way that it ends up where the mouse pointer is on the canvas.

    What I have:
    - the mouse position (x, y) on a Canvas3D object
    - a Point3d object for where a pick ray starting from the Canvas3D viewport intersects the first scene object (the point in 3D space where I want to start the drag)

    What I want:
    - some way to translate the Point3d's coordinates so that the initial point of intersection (the Point3d object) always stays under the mouse position on the canvas (the same relationship as when I used the pick ray to determine what the user clicked on in the Canvas3D object).

    Thanks!
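
    Not Java 3D-specific, but here is a minimal sketch of the usual technique: remember the eye-space depth of the originally picked Point3d, then for every new mouse position rebuild the ray through that pixel and place the point at the same depth along it, so it stays under the cursor. The pinhole camera model (looking down -z) and the parameter names are assumptions; in Java 3D you would get the equivalent numbers from the View/Canvas3D transforms.

        def drag_point(mouse_x, mouse_y, viewport_w, viewport_h, focal_px, pick_depth):
            """Eye-space position that keeps the dragged point under the mouse.

            focal_px   : focal length in pixels, viewport_h / (2 * tan(fovy / 2)).
            pick_depth : eye-space depth (distance along -z) of the original pick."""
            # Direction of the ray through this pixel, in eye space (z component -1).
            dir_x = (mouse_x - viewport_w / 2.0) / focal_px
            dir_y = (viewport_h / 2.0 - mouse_y) / focal_px   # screen y grows downward
            # Walk along the ray until its depth matches the stored pick depth.
            return dir_x * pick_depth, dir_y * pick_depth, -pick_depth

        # On each drag event: convert the result back to world coordinates with the
        # inverse view matrix and move the shape so the picked point lands there.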

    Read the article

  • Interpolating 2d data that is piecewise constant on faces

    - by celil
    I have an irregular mesh which is described by two variables - a faces array that stores the indices of the vertices that constitute each face, and a verts array that stores the coordinates of each vertex. I also have a function that is assumed to be piecewise constant over each face, and it is stored in the form of an array of values per face. I am looking for a way to construct a function f from this data. Something along the following lines:

        faces = [[0,1,2], [1,2,3], [2,3,4] ...]
        verts = [[0,0], [0,1], [1,0], [1,1],....]
        vals = [0.0, 1.0, 0.5, 3.0,....]

        f = interpolate(faces, verts, vals)
        f(0.2, 0.2) = 0.0 # point inside face [0,1,2]
        f(0.6, 0.6) = 1.0 # point inside face [1,2,3]

    The manual way of evaluating f(x,y) would be to find the corresponding face that the point x,y lies in, and return the value that is stored in that face. Is there a function that already implements this in scipy (or in matlab)?
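
    For triangular faces, one way to implement the "find the containing face, return its value" lookup is matplotlib's triangulation point locator; whether matplotlib is acceptable in your environment is an assumption on my part. The sample data matches the snippet above.

        import numpy as np
        from matplotlib.tri import Triangulation

        def make_piecewise_constant(faces, verts, vals):
            """Return f(x, y) giving the value of the face that contains (x, y)."""
            verts = np.asarray(verts, dtype=float)
            vals = np.asarray(vals, dtype=float)
            tri = Triangulation(verts[:, 0], verts[:, 1], triangles=faces)
            finder = tri.get_trifinder()                 # point-in-triangle locator

            def f(x, y):
                face = finder(np.asarray(x), np.asarray(y))   # -1 if outside the mesh
                return np.where(face >= 0, vals[face], np.nan)

            return f

        faces = [[0, 1, 2], [1, 2, 3]]
        verts = [[0, 0], [0, 1], [1, 0], [1, 1]]
        vals = [0.0, 1.0]
        f = make_piecewise_constant(faces, verts, vals)
        print(f(0.2, 0.2), f(0.6, 0.6))   # 0.0 (face [0,1,2]) and 1.0 (face [1,2,3])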

    Read the article

  • Using WP_Query with custom SQL in WordPress

    - by Matt Facer
    Hi. I am writing a plugin for WordPress and want to create my own search. I have tried to alter the built-in WordPress search, but what I am doing is very specific at the SQL level: I am comparing lat and long coordinates and selecting posts based on that. I can display posts using a standard wpdb query, but then I don't get the other features like paging. I'd like to be able to use my SQL statement with the WP_Query class; if I'm right in thinking, I should then be able to use the paging and other features that come from the $posts global variable. Is this right? I've googled for hours but can't find anything for plugins beyond using args to select categories etc. I simply need to send a complete SQL command - nothing else. Many thanks.

    Read the article

  • PHP extract GPS EXIF data

    - by Kami
    I would like to extract the GPS EXIF tag from pictures using PHP. I'm using exif_read_data(), which returns an array of all tags + data:

        GPS.GPSLatitudeRef: N
        GPS.GPSLatitude: Array ( [0] => 46/1 [1] => 5403/100 [2] => 0/1 )
        GPS.GPSLongitudeRef: E
        GPS.GPSLongitude: Array ( [0] => 7/1 [1] => 880/100 [2] => 0/1 )
        GPS.GPSAltitudeRef:
        GPS.GPSAltitude: 634/1

    I don't know how to interpret 46/1, 5403/100 and 0/1. The 46 might be 46°, but what about the rest, especially 0/1? What is this structure about, and how do I convert it to a "standard" form (like 46°56′48″N 7°26′39″E from Wikipedia)? I would like to pass these coordinates to the Google Maps API to display the pictures' positions on a map!
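
    The three values are degrees, minutes and seconds, each stored as an EXIF rational (numerator/denominator), so 46/1, 5403/100, 0/1 means 46° 54.03' 0". A quick sketch of the conversion to the decimal degrees the Google Maps API wants, shown in Python for brevity (the arithmetic is identical in PHP), using the values from the question:

        def rational(r):
            """Convert an EXIF rational string like '5403/100' to a float."""
            num, den = r.split("/")
            return float(num) / float(den)

        def to_decimal_degrees(dms, ref):
            """dms is the 3-element EXIF list [degrees, minutes, seconds]."""
            deg = rational(dms[0]) + rational(dms[1]) / 60.0 + rational(dms[2]) / 3600.0
            return -deg if ref in ("S", "W") else deg

        lat = to_decimal_degrees(["46/1", "5403/100", "0/1"], "N")   # 46 + 54.03/60
        lon = to_decimal_degrees(["7/1", "880/100", "0/1"], "E")     # 7 + 8.80/60
        print(lat, lon)   # about 46.9005, 7.1467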

    Read the article
