Search Results

Search found 32219 results on 1289 pages for 'screen size'.


  • Touch Screen Running Windows CE

    - by Jed
    I'm starting my first project that runs on a 7-inch touch screen running Windows CE 6.0 (and .NET CF 3.5). The touch screen doesn't respond well when I use my finger; the only way for me to navigate around is with a stylus (or similar). Since I've never worked with Windows CE or a resistive touch screen, I'm not sure whether I should expect to be able to use my finger, whether the stylus is essentially the only way to navigate effectively, or whether I simply have a touch screen that isn't very good. If you have experience with WinCE running on a touch screen, do you find that a stylus is the only way to go?

    Read the article

  • Shell Screen -X Stuff problems

    - by user1621988
    OPTIONS="java -Xms1024M -Xmx1024M -jar craftbukkit.jar" PROCESS=server01 screen -dmS $PROCESS $OPTIONS nogui # Starting the application screen -x $PROCESS -X stuff `printf "stop\r"` # Closing the application screen -x $PROCESS # Attaching to the terminal of the application Starting the application work fine, however I got problems with stuff 'printf "stop/r"' It seems not to work when I just start up, wait some time and then try to Stop it with command above. But the strange thing is, that IF I did screen -x $PROCESS and detach (ctrl-A & ctrl-D) and then I use the Stop command it does work. So is there a way around to stuff printf without -screen -x the $PROCESS? Thank You in Advance

    Read the article

  • GWT - Retrieve size of a widget that is not displayed

    - by Garagos
    I need to set the size of an AbsolutePanel according to its child's size, but the getOffset* methods return 0 because (I think) the child has not been displayed yet. A quick example:

        AbsolutePanel aPanel = new AbsolutePanel();
        HTML text = new HTML(/* variable length text */);
        int xPosition = 20; // actually variable
        aPanel.add(text, xPosition, 0);
        aPanel.setSize(xPosition + text.getOffsetWidth() + "px", "50px"); // 20px 50px

    I could also solve my problem by using the AbsolutePanel's size to set the child's position and size:

        AbsolutePanel aPanel = new AbsolutePanel();
        aPanel.setSize("100%", "50px");
        HTML text = new HTML(/* variable length text */);
        int xPosition = aPanel.getOffsetWidth() / 3; // once again, getOffsetWidth() returns 0
        aPanel.add(text, xPosition, 0);

    In both cases, I have to find a way to either retrieve the size of a widget that has not been displayed yet, or be notified when a widget is displayed.
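
    One pattern that usually gives a non-zero measurement (a sketch, assuming GWT 2.x): attach the panel to the document first, then defer the measurement with Scheduler so the browser has laid the widget out before getOffsetWidth() is read.

        // imports assumed: com.google.gwt.core.client.Scheduler,
        // com.google.gwt.user.client.ui.{AbsolutePanel, HTML, RootPanel}
        final AbsolutePanel aPanel = new AbsolutePanel();
        final HTML text = new HTML("variable length text");
        final int xPosition = 20; // actually variable
        aPanel.add(text, xPosition, 0);
        RootPanel.get().add(aPanel);   // offsets stay 0 until the widget is in the DOM

        Scheduler.get().scheduleDeferred(new Scheduler.ScheduledCommand() {
            public void execute() {
                // runs after the current event loop, so layout has been performed
                aPanel.setSize(xPosition + text.getOffsetWidth() + "px", "50px");
            }
        });

    Alternatively, overriding onLoad() on a widget gives a hook that fires when it is attached, which covers the "be notified when a widget is displayed" case.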

    Read the article

  • External File Upload Optimizations for Windows Azure

    - by rgillen
    [Cross-posted from here: http://rob.gillenfamily.net/post/External-File-Upload-Optimizations-for-Windows-Azure.aspx] I'm wrapping up a bit of the work we've been doing on data movement optimizations for cloud computing, and the latest set of data yielded some interesting points I thought I'd share. The work done here is not really rocket science but may, in some ways, be slightly counter-intuitive and therefore seemed worthy of posting.

    Summary: for those who don't like to read detailed posts or don't have time, the synopsis is that if you are uploading data to Azure, block your data (even down to 1MB) and upload in parallel. Set your block size based on your source file size, but if you must choose a fixed value, use 1MB. Following the above will result in significant performance gains: upwards of 10x-24x, and a reduction in overall file transfer time of upwards of 90% (e.g., uploading a 1GB file averaged 46.37 minutes prior to optimizations and 1.86 minutes afterwards).

    Detail: For those of you who want more detail, or think that the claims at the end of the preceding paragraph are over-reaching, what follows is information and code supporting these claims. As the title would indicate, these tests were run from our research facility pointing to the Azure cloud (specifically US North Central, as it is physically closest to us) and do not represent intra-cloud results. We have performed intra-cloud tests and the overall results are similar in notion, but the data rates are significantly different, as are the tipping points for the various block sizes (this will be detailed separately).

    We started by building a very simple console application that would loop through a directory and upload each file to Azure storage. This application used the shipping storage client library from the 1.1 version of the Azure tools. The only real variation from the client library is that we added code to collect and record the duration (in ms) and size (in bytes) for each file transferred. The code is available here. We then created a directory that had a collection of files of the following sizes: 2KB, 32KB, 64KB, 128KB, 512KB, 1MB, 5MB, 10MB, 25MB, 50MB, 100MB, 250MB, 500MB, 750MB, and 1GB (50 files for each size listed). These files contained randomly-generated binary data and do not benefit from compression (a separate discussion topic). Our file generation tool is available here.

    The baseline was established by running the application described above against the directory containing all of the data files. This application uploads the files in a random order so as to avoid transferring all of the files of a given size sequentially, thereby spreading the effects of periodic Internet delays across the collection of results. We then ran some scripts to split the resulting data and generate some reports. The raw data collected for our non-optimized tests is available via the links in the Related Resources section at the bottom of this post. For each file size, we calculated the average upload time (and standard deviation) and the average transfer rate (and standard deviation). As you likely are aware, transferring data across the Internet is susceptible to many transient delays which can cause anomalies in the resulting data. It is for this reason that we randomized the order of source file processing as well as executed the tests 50x for each file size. We expect that these steps will yield a sufficiently balanced set of results.
    Once the baseline was collected and analyzed, we updated the test harness application with some methods to split the source file into user-defined block sizes and then upload those blocks in parallel (using the PutBlock() method of Azure storage). The parallelization was handled by simply relying on the Parallel Extensions to .NET to provide a Parallel.For loop (see the linked source for specific implementation details in Program.cs, line 173 and following; less than 100 lines total). Once all of the blocks were uploaded, we called PutBlockList() to assemble/commit the file in Azure storage. For each block transferred, the MD5 was calculated and sent, ensuring that the bits that arrived matched what was intended. The timer for the blocked/parallelized transfer method wraps the entire process (source file splitting, block transfer, MD5 validation, file committal). A rough sketch of this pattern appears at the end of this post; a diagram of the process is included in the original post.

    We then tested the effects of blocking and parallelizing the transfers by running the updated application against the same source set and did a parameter sweep on the block size, including 256KB, 512KB, 1MB, 2MB, and 4MB (our assumption was that anything lower than 256KB wasn't worth the trouble, and 4MB is the maximum size of a block supported by Azure). The raw data for the parallel tests is available via the links in the Related Resources section at the bottom of this post. This data was processed and then compared against the single-threaded / non-optimized transfer numbers, and the results were encouraging. The Excel version of the results is available here.

    Two semi-obvious points need to be made prior to reviewing the data. The first is that if the block size is larger than the source file size, you will end up with a "negative optimization" due to the overhead of attempting to block and parallelize. The second is that as the files get smaller, the clock-time cost of blocking and parallelizing (overhead) is more apparent and can tend towards negative optimizations. For this reason (and as supported by the raw data provided in the linked worksheet), the charts and discussion below ignore source file sizes less than 1MB.

    The first chart (transfer-rate improvement by block size) illustrates some interesting points about the results:
      - When the block size is smaller than the source file, performance increases; but as the block size approaches and then passes the source file size, you see decreasing benefit, to the point of negative gains (see the values for the 1MB file size).
      - For some of the moderately-sized source files, small blocks (256KB) are best.
      - As the size of the source file gets larger (see values for 50MB and up), the smallest block size is not the most efficient (presumably due, at least in part, to the increased number of blocks, increased number of individual transfer requests, and reassembly/committal costs).
      - Once you pass the 250MB source file size, the difference in rate for 1MB to 4MB blocks is more-or-less constant.
      - The 1MB block size gives the best average improvement (~16x), but the optimal approach would be to vary the block size based on the size of the source file.

    The second chart is another view of the same data with the axes changed (the x-axis represents file size and the plotted data shows improvement by block size). It again highlights the fact that the 1MB block size is probably the best overall size, but also shows the benefits of some of the other block sizes at different source file sizes.
    This last chart shows the change in total duration of the file uploads based on different block sizes for the source file sizes. Nothing really new here, other than that this view of the data highlights the negative effects of poorly choosing a block size for smaller files.

    Summary: What we have found so far is that blocking your file uploads and uploading them in parallel results in significant performance improvements. Further, utilizing extension methods and the Task Parallel Library (.NET 4.0) makes short work of altering the shipping client library to provide this functionality while minimizing the amount of change to existing applications that might be using the client library for other interactions.

    Related Resources:
      - Source code for upload test application
      - Source code for random file generator
      - OData feed of raw data from non-optimized transfer tests (Experiment Metadata; Experiment Datasets: 2KB, 32KB, 64KB, 128KB, 256KB, 512KB, 1MB, 5MB, 10MB, 25MB, 50MB, 100MB, 250MB, 500MB, 750MB, and 1GB Uploads; Raw Data)
      - OData feeds of raw data from blocked/parallelized transfer tests (Experiment Metadata; Experiment Datasets; Raw Data: 256KB, 512KB, 1MB, 2MB, and 4MB Blocks)
      - Excel worksheet showing summarizations and comparisons
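
    For readers who only want the shape of the blocked/parallel upload described above, the following is a rough, simplified sketch. It is not the post's Program.cs; it assumes the v1.x Microsoft.WindowsAzure.StorageClient API the post refers to, and the connection string, container name, and block-ID scheme are placeholders.

        // Sketch: split a file into fixed-size blocks, upload them in parallel with
        // PutBlock(), then commit the block list with PutBlockList().
        using System;
        using System.IO;
        using System.Security.Cryptography;
        using System.Threading.Tasks;
        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.StorageClient;

        class BlockUploadSketch
        {
            static void Upload(string path, string connectionString, int blockSize)
            {
                var account = CloudStorageAccount.Parse(connectionString);
                var blob = account.CreateCloudBlobClient()
                                  .GetContainerReference("uploads")              // placeholder container
                                  .GetBlockBlobReference(Path.GetFileName(path));

                long length = new FileInfo(path).Length;
                int blockCount = (int)((length + blockSize - 1) / blockSize);
                var blockIds = new string[blockCount];

                Parallel.For(0, blockCount, i =>
                {
                    using (var fs = File.OpenRead(path))                          // one stream per block
                    {
                        long offset = (long)i * blockSize;
                        int size = (int)Math.Min(blockSize, length - offset);
                        var buffer = new byte[size];
                        fs.Seek(offset, SeekOrigin.Begin);
                        int read = 0;
                        while (read < size)
                            read += fs.Read(buffer, read, size - read);

                        // fixed-length, base64-encoded block IDs; MD5 sent per block
                        blockIds[i] = Convert.ToBase64String(BitConverter.GetBytes(i));
                        string md5 = Convert.ToBase64String(MD5.Create().ComputeHash(buffer));
                        blob.PutBlock(blockIds[i], new MemoryStream(buffer), md5);
                    }
                });

                blob.PutBlockList(blockIds);                                      // assemble/commit the blob
            }
        }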

    Read the article

  • Is a bigger display (monitor) always better for development?

    - by Jitendra Vyas
    Is a bigger display (monitor) always better for development? I'm going to buy a new LCD monitor. I mostly work in Adobe Photoshop, HTML, CSS, jQuery and WordPress. Budget is not a problem, and there are many size options for LCD monitors. My questions are: Is maximum size always better, or are very large monitors not always good? Would it be better to buy two 21.5-inch monitors rather than one 30-inch monitor? Which monitor size would you prefer, between 21.5 inches and 30 inches, if budget is not a problem?

    Read the article

  • Ubuntu 13.04 boots into a black screen even after installing NVIDIA drivers; fails at "Starting Reload cups, upon starting avahi-daemon..."

    - by Elad92
    I got a new machine with an i7 4770K and a GeForce GTX 660. I installed Windows and then installed Ubuntu 13.04. The installation went well; afterwards I booted from the USB again, chose "Try Ubuntu", and installed boot-repair because Windows booted automatically with no option for Ubuntu. After I repaired the boot, I restarted and got into GRUB. When I chose Ubuntu I got a message saying I'm running in low-graphics mode, and when I pressed OK the screen turned off. I couldn't do anything, so I restarted, and after reading this post: My computer boots to a black screen, what options do I have to fix it? I got into recovery, selected dpkg and repaired packages. Then I rebooted and tried to press 'e' and change quiet splash to no splash and nomodeset, but unfortunately neither of them worked; now the screen didn't turn off, but I just saw a blank screen. So I rebooted again, entered recovery, and this time I went to the root shell and tried to install the NVIDIA drivers following this guide: http://www.howopensource.com/2012/10/install-nvidia-geforce-driver-in-ubuntu-12-10-12-04-using-ppa/ I rebooted and got the same black screen again (the screen didn't go off, I just saw a blank screen). In GRUB I see that the kernel is 3.8.0-25. I also checked that the USB files are not corrupted, using the check-CD option from the Ubuntu installation screen. I'm really frustrated; I don't know what else I can do. (If someone also knows how I can connect to wifi from the root shell it would be very appreciated, because when I choose 'enable networking' it again boots me to a black screen.) Thanks

    Edit: After digging more, I again booted with nomodeset and saw where it is failing; these are the lines I saw:
        * Starting Reload cups, upon starting avahi-daemon to make sure remote queues are populated [OK]
        * Starting Reload cups, upon starting avahi-daemon to make sure remote queues are populated [fail]
    I searched for it on Google and this is the closest result I got: http://ubuntuforums.org/showthread.php?t=2144261 I didn't find any way to solve it on the internet; from what I understand this is a problem with 13.04. If someone knows how to fix it I will be very grateful. Thanks

    Edit 2 (23.6.13): As Mitch suggested, I disabled the avahi-daemon, and when I boot with nomodeset I get the following error (error screenshot not included in this excerpt). What else could be the problem?
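
    A rough sketch of the two steps discussed above, for reference (untested on this hardware; paths assume a stock 13.04 install):

        # make the nomodeset workaround permanent instead of editing the boot line every time
        sudo nano /etc/default/grub     # set: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
        sudo update-grub

        # disable the avahi-daemon Upstart job referenced in the "Starting Reload cups..." messages
        echo manual | sudo tee /etc/init/avahi-daemon.override
        sudo service avahi-daemon stop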

    Read the article

  • How to determine the size of a package in terminal prior to downloading?

    - by user14590
    When using apt-get install <package_name>, and there are dependencies that need to be downloaded, the terminal outputs the names of the additional packages and the total size, and asks for confirmation before downloading. But when the dependencies are already satisfied and nothing but the named package needs to be downloaded, there is no size output and no confirmation. In Synaptic, I can see the total size that new packages will use after installation, but there is no way to see the size that needs to be downloaded, except to go from package to package and use Properties to see the compressed size. Is there a way to see the size of a package (or packages) in the terminal and in Synaptic prior to downloading and installing it?
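
    Two ways to see the sizes from the terminal before installing (a sketch; <package_name> is whatever you plan to install):

        # compressed download size ("Size") and unpacked size ("Installed-Size", in KB) from the package index
        apt-cache show <package_name> | grep -E '^(Size|Installed-Size):'

        # list everything apt would download for this install, with per-file sizes, without installing
        apt-get --print-uris install <package_name>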

    Read the article

  • Hide usernames shown on Windows Server 2008 Remote Desktop login screen

    - by user38553
    When I remote desktop to my Windows Server 2008 (a hosted virtual server) I see a login screen showing an icon for each user in the system. I can click on a user then enter a password and login. This is a terrible security oversight in my opinion as it gives anyone that might want to compromise my server a full list of valid usernames. Is there a way to revert to the old style of login screen requiring both username and password? Thanks
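
    The setting usually pointed to for this is the "Interactive logon: Do not display last user name" security policy (Local Security Policy > Local Policies > Security Options), which makes the logon screen prompt for a typed username and password instead of showing user tiles. A sketch of the equivalent registry change (verify on a non-production box first):

        reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v dontdisplaylastusername /t REG_DWORD /d 1 /f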

    Read the article

  • How to lock the screen on a Mac?

    - by George2
    Hello everyone, I am using a MacBook Pro running Mac OS X 10.5. I am new to this environment and previously worked on Windows. I am wondering how to lock the screen on a Mac, like Windows Key + D to lock the screen on a Windows PC? Thanks in advance, George
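
    Two options that exist on 10.5 (a sketch; double-check the exact paths on your system): in Keychain Access > Preferences, enable "Show Status in Menu Bar" to get a menu-bar lock icon with a "Lock Screen" command; or, from Terminal, jump straight to the login window:

        # locks the current session by switching to the login window (fast user switching helper)
        "/System/Library/CoreServices/Menu Extras/User.menu/Contents/Resources/CGSession" -suspend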

    Read the article

  • Display driver stopped responding and recovered successfully - blue screen - HP EliteBook

    - by Fahad Saleem
    I have an HP EliteBook, and this problem has become a routine headache for me (it happens about every 20-25 minutes). While I'm using the machine, suddenly everything hangs and the screen goes black, then a message pops up in the taskbar saying the display driver stopped responding and was successfully recovered; but the hangs and black screen continue, and then the blue screen of death appears, and I have no option but to restart. Is this a driver problem? A hardware problem? What should I do? Please be specific; no stories and theories.

    Read the article

  • Reverse Screen Share: Mac Snow Leopard

    - by Shyam
    Hi, I am currently enjoying the powerful features of screen sharing. I was wondering, though, if it would be possible to share my screen with another Mac. I have a MacBook Pro that can connect to a Mac mini (which in turn is connected to a flat screen). Is this possible, and what must I do to achieve this?
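
    If both machines are on the same network, a minimal sketch: turn on Screen Sharing on the MacBook Pro (System Preferences > Sharing > Screen Sharing), then connect to it from the Mac mini with the built-in viewer; the hostname below is a placeholder for the MacBook Pro's Bonjour name or IP address.

        # run on the Mac mini; opens Screen Sharing.app pointed at the MacBook Pro
        open vnc://macbook-pro.local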

    Read the article

  • Windows 7 crashes with a blue screen

    - by Chinook pilot
    I am running Windows 7 32-bit on a Gateway Media Center. Without any interaction from me, and with only my screen saver on the screen, it crashes and the fault is "Stop" followed by a long string of numbers. I previously had an "adapi sys" message, but now it seems to be this "Stop" message. Any help will be appreciated.

    Read the article

  • RDC and black screen mystery

    - by Vidar
    The following are all Windows 7 64-bit Business edition computers. I have a PC (let's call it "PC1") that I remotely access now and again using Remote Desktop Connection (RDC) from my laptop downstairs. Sometimes when I physically go back to sit at PC1, the screen is black and there is no way for me to wake it: no login screen or anything. The PC is still on, and I am always forced to do a hard reboot. What can I do to stop this happening? It's really annoying.
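
    One workaround often suggested when the console stays black after an RDP session (a sketch; session IDs vary and the commands need an elevated prompt): reconnect over RDP and hand the session back to the physical console with tscon.

        rem list the sessions and note the ID of your own active session
        qwinsta
        rem example assuming your session ID is 1: redirect it back to the local console
        tscon 1 /dest:console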

    Read the article

  • Virtual Mac OS X 10.6.8 in VMware does not save screen captures

    - by epeleg
    I have a VMware image of Mac OS X 10.6.8 (fully updated). When I press Command+Shift+3 it makes a camera-shutter sound, but no screen capture is saved anywhere that I can find. Running defaults read com.apple.screencapture location returns /Users/admin/Pictures/Captures; this folder exists and is empty. I also executed chmod 777 /Users/admin/Pictures/Captures. Any ideas, anyone? Could this be related to the VMware screen resolution (size) of this Mac? (It is currently set to 1348x1391.)
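
    A quick sanity check (a sketch, assuming the guest otherwise behaves like a normal 10.6 install): point the capture location at a folder you know is writable, restart SystemUIServer so the setting is re-read, and try the command-line capture tool to rule out the hotkey:

        defaults write com.apple.screencapture location ~/Desktop
        killall SystemUIServer
        # bypasses the keyboard shortcut entirely
        screencapture ~/Desktop/test.png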

    Read the article

  • How to record screen and sound

    - by user23950
    Do you know of any application that can record both sound and the screen? CamStudio records only the screen, not the sound coming out of the speakers. How do I do this? I frequently stream online content, and I'd like to record a stream so that I have a copy of it.
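
    If a command-line tool is acceptable, ffmpeg on Windows can capture the screen and an audio device in one pass (a sketch; the audio device name varies per machine, so list the devices first and substitute yours):

        # list DirectShow capture devices to find the system-audio / "Stereo Mix" name
        ffmpeg -list_devices true -f dshow -i dummy

        # record the desktop plus that audio device (the device name here is a placeholder)
        ffmpeg -f gdigrab -framerate 30 -i desktop -f dshow -i audio="Stereo Mix" -c:v libx264 -c:a aac output.mp4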

    Read the article
