Search Results

Search found 72722 results on 2909 pages for 'file processing'.


  • Using PHP GD to create an image from text with different fonts

    - by Meredith
    I have been using this simple script to generate images from text: <?php header('Content-type: image/png'); $color = RgbfromHex($_GET['color']); $text = urldecode($_GET['text']); $font = 'arial.ttf'; $im = imagecreatetruecolor(400, 30); $bg_color = imagecolorallocate($im, 255, 255, 255); $font_color = imagecolorallocate($im, $color[0], $color[1], $color[2]); imagefilledrectangle($im, 0, 0, 399, 29, $bg_color); imagettftext($im, 20, 0, 10, 20, $font_color, $font, $text); imagepng($im); imagedestroy($im); function RgbfromHex($hexValue) { if(strlen(trim($hexValue))==6) { return array( hexdec(substr($hexValue,0,2)), // R hexdec(substr($hexValue,2,2)), // G hexdec(substr($hexValue,4,2)) // B ); } else return array(0, 0, 0); } ?> I call the script with file.php?text=testing script&color=000000 Now I'd like to know how I could generate text with normal and bold fonts mixed in the same image, something like file.php?text=testing <b>script</b>&color=000000
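
    A common way to mix weights is to treat the bold spans as separate runs: load a second (bold) TTF alongside the regular one, draw each run with its own font, and advance the x offset by the measured width of the previous run (in GD that means one imagettftext call per run plus imagettfbbox for measuring). Below is a minimal sketch of that idea in Python/Pillow rather than PHP; the font file names and the hard-coded run split are assumptions for illustration.

      # Sketch: render "testing " in a regular face and "script" in a bold face,
      # advancing x by each run's measured width. Assumes Pillow >= 8 (textlength)
      # and that arial.ttf / arialbd.ttf can be found.
      from PIL import Image, ImageDraw, ImageFont

      regular = ImageFont.truetype("arial.ttf", 20)
      bold = ImageFont.truetype("arialbd.ttf", 20)

      img = Image.new("RGB", (400, 30), "white")
      draw = ImageDraw.Draw(img)

      runs = [("testing ", regular), ("script", bold)]  # parsed from testing <b>script</b>
      x = 10
      for text, font in runs:
          draw.text((x, 5), text, font=font, fill=(0, 0, 0))
          x += draw.textlength(text, font=font)

      img.save("out.png")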

    Read the article

  • How to set the output image size when using com.android.camera.action.CROP

    - by adi.zean
    I have code to crop an image, like this: public void doCrop(){ Intent intent = new Intent("com.android.camera.action.CROP"); intent.setType("image/*"); List<ResolveInfo> list = getPackageManager().queryIntentActivities(intent,0); int size = list.size(); if (size == 0 ){ Toast.makeText(this, "Can't find crop app", Toast.LENGTH_SHORT).show(); return; } else{ intent.setData(selectImageUri); intent.putExtra("outputX", 300); intent.putExtra("outputY", 300); intent.putExtra("aspectX", 1); intent.putExtra("aspectY", 1); intent.putExtra("scale", true); intent.putExtra("return-data", true); if (size == 1) { Intent i = new Intent(intent); ResolveInfo res = list.get(0); i.setComponent(new ComponentName(res.activityInfo.packageName, res.activityInfo.name)); startActivityForResult(i, CROP_RESULT); } } } public void onActivityResult (int requestCode, int resultCode, Intent data){ if (resultCode == RESULT_OK){ if (requestCode == CROP_RESULT){ Bundle extras = data.getExtras(); if (extras != null){ bmp = extras.getParcelable("data"); } File f = new File(selectImageUri.getPath()); if (f.exists()) f.delete(); Intent inten3 = new Intent(this, tabActivity.class); startActivity(inten3); } } } From what I have read, the code intent.putExtra("outputX", 300); intent.putExtra("outputY", 300); is used to set the resolution of the crop result, but why can't I get a result image resolution higher than 300x300? When I set intent.putExtra("outputX", 800); intent.putExtra("outputY", 800); the crop function has no result or crashes. Any idea for this situation? The logcat says "! ! ! FAILED BINDER TRANSACTION ! ! !"

    Read the article

  • What hash algorithms are parallelizable? Optimizing the hashing of large files on multi-core CPUs

    - by DanO
    I'm interested in optimizing the hashing of some large files (optimizing wall clock time). The I/O has been optimized well enough already and the I/O device (local SSD) is only tapped at about 25% of capacity, while one of the CPU cores is completely maxed-out. I have more cores available, and in the future will likely have even more cores. So far I've only been able to tap into more cores if I happen to need multiple hashes of the same file, say an MD5 AND a SHA256 at the same time. I can use the same I/O stream to feed two or more hash algorithms, and I get the faster algorithms done for free (as far as wall clock time). As I understand most hash algorithms, each new bit changes the entire result, and it is inherently challenging/impossible to do in parallel. Are any of the mainstream hash algorithms parallelizable? Are there any non-mainstream hashes that are parallelizable (and that have at least a sample implementation available)? As future CPUs will trend toward more cores and a leveling off in clock speed, is there any way to improve the performance of file hashing? (other than liquid nitrogen cooled overclocking?) or is it inherently non-parallelizable?
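
    For reference: plain MD5 and the SHA-1/SHA-2 family are Merkle–Damgård constructions, so each block depends on the previous chaining value and a single stream cannot be split across cores. The usual workaround is tree (Merkle) hashing — hash fixed-size chunks independently, then hash the concatenation of the chunk digests — which some designs (e.g. MD6, Skein's tree mode) support natively. A rough sketch of the chunked scheme, assuming Python; note the result is a tree hash, not interchangeable with a plain SHA-256 of the whole file.

      # Tree-style hashing sketch: hash 8 MiB chunks on worker threads, then hash
      # the concatenated chunk digests. CPython's hashlib releases the GIL while
      # hashing large buffers, so the threads can genuinely overlap. The chunk
      # size and file name are placeholders.
      import hashlib
      from concurrent.futures import ThreadPoolExecutor

      CHUNK = 8 * 1024 * 1024

      def tree_sha256(path, workers=4):
          futures = []
          with open(path, "rb") as f, ThreadPoolExecutor(max_workers=workers) as pool:
              while True:
                  data = f.read(CHUNK)
                  if not data:
                      break
                  futures.append(pool.submit(hashlib.sha256, data))
              root = hashlib.sha256()
              for fut in futures:   # a production version would bound the in-flight chunks
                  root.update(fut.result().digest())
          return root.hexdigest()

      print(tree_sha256("bigfile.bin"))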

    Read the article

  • No Preview Images in File Open Dialogs on Windows 7

    - by Rick Strahl
    I’ve been updating some file uploader code in my photoalbum today and while I was working with the uploader I noticed that the File Open dialog using Silverlight that handles the file selections didn’t allow me to ever see an image preview for image files. It sure would be nice if I could preview the images I’m about to upload before selecting them from a list. Here’s what my list looked like: This is the Medium Icon view, but regardless of the views available including Content view only icons are showing up. Silverlight uses the standard Windows File Open Dialog so it uses all the same settings that apply to Explorer when displaying content. It turns out that the Customization options in particular are the problem here. Specifically the “Always show icons, never thumbnails” option: I had this option checked initially, because it’s one of the defenses against runaway random Explorer views that never stay set at my preferences. Alas, while this setting affects Explorer views apparently it also affects all dialog based views in the same way. Unchecking the option above brings back full thumbnailing for all content and icon views. Here’s the same Medium Icon view after turning the option off: which obviously works a whole lot better for selection of images. The bummer of this is that it’s not controllable at the dialog level – at least not in Silverlight. Dialogs obviously have different requirements than what you see in Explorer so the global configuration is a bit extreme especially when there are no overrides on the dialog interface. Certainly for Silverlight the ability to have previews is a key feature for many applications since it will be dealing with lots of media content most likely. Hope this helps somebody out. Thanks to Tim Heuer who helped me track this down on Twitter. © Rick Strahl, West Wind Technologies, 2005-2010. Posted in Silverlight, Windows

    Read the article

  • help! corrupt file recovery

    - by TheBumpper
    My supervisor computer crashed last night, and I'm trying to help him out. He made an R script but when he tried to open it, it was empty. But for some reason the file is 7.9kb so it should not be empty i think... anyway when i tried to open it, Gedit gave this error: "The file you opened has some invalid characters. If you continue editing this file you could corrupt this document. You can also choose another character encoding and try again." and the options to encode the characters. It looked like this(with a red background): \00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\00\ My question is, is there a way to restore the file? i hope someone has a brilliant idea
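
    A file that shows nothing but \00 bytes usually means the filesystem recovered the file's length but not its data blocks after the crash (a common outcome when an editor's save was interrupted), in which case the content is simply gone from that file. A quick way to confirm whether any non-NUL bytes survived, sketched in Python with a placeholder path:

      # Count bytes that are not NUL; zero means there is nothing left to recover
      # from this file, and the content would have to come from an editor
      # autosave/backup or filesystem-level recovery instead.
      data = open("script.R", "rb").read()
      non_null = sum(1 for b in data if b != 0)
      print(len(data), "bytes total,", non_null, "non-NUL bytes")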

    Read the article

  • How to get the path of a file after publishing my game

    - by NDraskovic
    I made a "game" for a college project that reads data from a .txt file at startup and draws some models according to the data in that file. This is the code I use using (StreamReader sr = new StreamReader(@"C:\Users\User\Desktop\Linije.txt")) { String linija; while ((linija = sr.ReadLine()) != null) { red = linija.Split(','); model = red[0]; x = red[1]; y = red[2]; z = red[3]; elementi.Add(Convert.ToInt32(model)); podatci.Add(new Vector3(Convert.ToSingle(x),Convert.ToSingle(y),Convert.ToSingle(z))); } } As you see, this code fills some variables that are then used to define the model that will be drawn and the coordinates where it will be drawn. The problem I'm having is that I don't know how to distribute that file to other computers (obviously on another computer it would have a different path). Do you have any advice on how to do this? P.S. I tried to put it in the Content and set the Build Action to None, and I can see the file in the content directory, but when I change it, nothing happens (the models don't change as they should)
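
    The usual fix is to ship Linije.txt alongside the game's output and build its path relative to the executable (or the Content folder) at run time instead of hard-coding a path on one user's desktop; in an XNA project that typically means marking the file "Copy to Output Directory" and opening something like "Content/Linije.txt". The path-resolution idea, sketched in Python with the names taken from the question:

      # Resolve the data file next to the program itself rather than via an
      # absolute path, so the same code works on any machine it is copied to.
      import os, sys

      base = os.path.dirname(os.path.abspath(sys.argv[0]))
      lines_path = os.path.join(base, "Content", "Linije.txt")
      with open(lines_path) as f:
          for line in f:
              if not line.strip():
                  continue
              model, x, y, z = line.strip().split(",")
              print(model, x, y, z)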

    Read the article

  • 3DS Max 2012 OBJ file import missing polygons

    - by Vit
    I started learning OpenGL. I got to a point where I want to import some "real" objects. After "Googling" I decided I would go with the OBJ file format for a start, since it is simple to understand and there are plenty of tutorials on how to read it properly. I have access to 3DS Max 2012 from university. So I tried to create a very simple model (just a deformed cube) and export it to an OBJ file, just vertices and triangles for the moment, without textures, so I could examine its structure by myself. But when I imported it right back into 3DS Max from the OBJ file, it rendered somewhat strangely, as if smoothed and lit by a light source, even though I have none in the scene. The geometry, i.e. its wireframe, is intact though. So I thought maybe it was a problem of exporting only vertices and triangles, so I downloaded an Enterprise-D model from the internet, exported it with everything on (normals, textures, everything), and imported it again. Now some polygons are missing. So I want to ask, am I doing something terribly wrong, or is there some incompatibility issue between the .max and .obj formats? Even though it is only a simple textured model without any light sources, animation etc.? Thanks. Edit: I tried the objects with MeshLab; the first, deformed cube was absolutely OK. But it still bothers me that 3DS Max doesn't render it properly. In the Enterprise-D model, there are polygons missing even in MeshLab. I uploaded a rar archive with the .max model of the Enterprise, the same .obj model exported from 3DS, and the OBJ model of the deformed cube. Download here (2.5 MB, filesonic).
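
    Two things worth separating here: the changed shading on re-import is usually just 3DS Max regenerating smoothing groups/normals and assigning a default material under the viewport's default lighting, not damaged geometry; genuinely missing polygons most often come from n-gon faces or flipped normals combined with backface culling. Since OBJ is plain text, the export can be sanity-checked directly; a small sketch in Python with a placeholder file name:

      # Count vertices and faces in an OBJ and flag faces with more than three
      # vertex references (n-gons), a common cause of "missing" polygons in
      # stricter importers and renderers.
      verts = faces = ngons = 0
      with open("enterprise.obj") as obj:
          for line in obj:
              if line.startswith("v "):
                  verts += 1
              elif line.startswith("f "):
                  faces += 1
                  if len(line.split()) > 4:   # "f" token plus more than 3 vertices
                      ngons += 1
      print(verts, "vertices,", faces, "faces,", ngons, "non-triangle faces")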

    Read the article

  • How do I create a .deb file?

    - by JamesTheAwesomeDude
    Yes, I know that this question has been asked many times before, but none of the answers really helped. I'd like to package the Minecraft launcher (which has no proprietary code, AFAIK) into a .deb file so that I can put it on a flash drive and share it with my friends. I have managed to install Minecraft manually (put some files into /opt/minecraft, download an icon, and create a .desktop file in /usr/share/applications), and I have made a shell script that completely automates the process, but it relies on wget to retrieve a few files, including the .desktop file. (It isn't a self-extracting archive, after all.) I'd like to be able to do this offline, as a lot of my friends have slow or no internet. (One of their internet lines was buried so shallowly that it actually got knocked out by the lawnmower.) I won't be loading it into a PPA or anything like that; I just want it to be a "formal" package that can be easily installed and uninstalled. (One thing that I would like is for sudo apt-get purge minecraft to also remove the .minecraft folder. It would also be nice to define the dependencies as being able to accept OpenJDK or Sun's JVM.) Oh, just so you know, the Minecraft launcher is a .jar file, but I can very, very easily launch it via shell scripts. The exact command is right on the download page.

    Read the article

  • How to make the run button run the project, not the file, in Eclipse

    - by Roy T.
    I'm using the Spring IDE, a variant of Eclipse to create a Java project. One big irritation I have is that when I press the run button Eclipse tries to run the current file, which usually fails because it doesn't have a main method. I've set up run configurations in the hope that would make the play button default to the run configuration instead of the current file, but that doesn't work either. Now to run my application correctly I have to press the little arrow next to play, select my favorite run configuration and then it works, this is only two extra clicks but it's tedious, the button is small and I feel like I shouldn't have to perform these extra steps. I mean what is the point of run configurations and projects if it still tries to run a file by default? Even more preferably I wouldn't even want to touch the mouse but just press Ctrl+F11, but this has the same behavior. All above applies to debugging as well btw. So my question is this: how do I make the run and debug buttons (and their short keys) default to the project's run configuration instead of to trying (and failing) to run only the current file? Much like it is in Visual Studio and other IDEs?

    Read the article

  • How to fix E: Internal Error, No file name for libc6

    - by Loren Ramly
    How to fix E: Internal Error, No file name for libc6, Like that will show If I do: $ sudo apt-get upgrade or $ sudo apt-get install package This is example : $ sudo apt-get upgrade Reading package lists... Done Building dependency tree Reading state information... Done The following packages have been kept back: ginn hplip hplip-data libdrm-dev libdrm-intel1 libdrm-nouveau1a libdrm-radeon1 libdrm2 libgrip0 libhpmud0 libkms1 libsane-hpaio libunity-2d-private0 libunity-core-5.0-5 linux-generic-pae linux-headers-generic-pae linux-image-generic-pae printer-driver-hpcups printer-driver-hpijs unity unity-2d-common unity-2d-panel unity-2d-shell unity-2d-spread unity-common unity-services The following packages will be upgraded: alsa-base firefox firefox-globalmenu firefox-gnome-support firefox-locale-en icedtea-6-jre-cacao icedtea-6-jre-jamvm icedtea-7-jre-jamvm libdbus-glib-1-2 libdbus-glib-1-dev libgnutls-dev libgnutls-openssl27 libgnutls26 libgnutlsxx27 libssl-dev libssl-doc libssl1.0.0 linux-sound-base openjdk-6-jre openjdk-6-jre-headless openjdk-6-jre-lib openjdk-7-jdk openjdk-7-jre openjdk-7-jre-headless openjdk-7-jre-lib openssl sudo 27 upgraded, 0 newly installed, 0 to remove and 26 not upgraded. 3 not fully installed or removed. Need to get 0 B/126 MB of archives. After this operation, 3,072 B of additional disk space will be used. Do you want to continue [Y/n]? y E: Internal Error, No file name for libc6 I have follow instruction from here E: Internal Error, No file name for libssl1.0.0 . Which do: sudo apt-get update sudo apt-get clean sudo apt-get install -fy sudo dpkg -i /var/cache/apt/archives/*.deb sudo dpkg --configure -a sudo apt-get install -fy sudo apt-get dist-upgrade But stuck with same error E: Internal Error, No file name for libc6 when do command sudo apt-get install -fy. And I've been looking on google, but have not been successful until now. Thanks.

    Read the article

  • Bulk Rename Tool is a Lightweight but Powerful File Renaming Tool

    - by Jason Fitzpatrick
    There’s no need to settle for overly simplistic file renaming tools as long as Bulk Rename Tool is around. It’s lightweight, insanely customizable, portable, and sure to make short work of any renaming task you throw at it. Bulk Rename Tool is a great portable application (available as an installed version if you crave context menu integration) that blasts through file renaming tasks. The main panel is intimidatingly packed with toggles and variables you can alter; this isn’t a one-click solution by any means. That said, once you get comfortable using the interface it’s lightning fast and extremely flexible. One tip that will save you an enormous amount of frustration when you get started: make sure to highlight the files you want to change in the file preview window (located in the upper right corner) or else you won’t see the preview and won’t know if the changes you’re making in the control panel are yielding the file names you desire. Hit up the link below to read more and grab a copy; Bulk Rename Tool is free, Windows only.

    Read the article

  • How to maintain symlinks in a Linux file manager?

    - by MountainX
    I want to use symlinks extensively. However, if I move the target file, the symlink becomes broken (unlike on Windows). That's not acceptable to me, so I either need a solution or I won't be able to use symlinks the way I wish to. Is there a solution that will work with the Dolphin file manager? A command line solution is described on commandlinefu. In summary, it is something like one of these: lmv(){ for a in ${@:1:$(expr $# - 1)}; do [ -e "$a" -a -e "${@:$#:1}" ] && mv "$a" "${@:$#:1}" && ln -s "${@:$#:1}"/"$(basename "$a")" "$(dirname "$a")"; done; } lmv(){ for a in ${@:1:$(expr $# - 1)}; do [ -e "$a" -a -e "${@:$#}" ] && mv "$a" "${@:$#}" && ln -s "${@:$#}"/"$(basename "$a")" "$(dirname "$a")"; done; } But about half the time I'm using a file manager (Dolphin), so I need a complete solution to this problem. Is a solution available for a GUI file manager? EDIT: The context of this question is that I'm searching for an alternative to hardlinks. I previously asked this question about the pitfalls of hardlinks.
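
    What the one-liner does is simply: move each file into the target directory, then leave a symlink at the file's old path pointing at its new location, so existing references keep resolving. The same operation is easy to wrap in a small script that a file manager can call (for example as a Dolphin service menu entry); a sketch in Python, with the argument order and the choice of a relative link being assumptions:

      #!/usr/bin/env python3
      # Usage (hypothetical): lmv.py TARGET_DIR FILE [FILE ...]
      # Moves each FILE into TARGET_DIR and leaves a relative symlink behind.
      import os, shutil, sys

      target_dir = os.path.abspath(sys.argv[1])
      for src in sys.argv[2:]:
          src = os.path.abspath(src)
          dest = os.path.join(target_dir, os.path.basename(src))
          shutil.move(src, dest)
          os.symlink(os.path.relpath(dest, os.path.dirname(src)), src)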

    Read the article

  • Parse text file on click and display

    - by John R
    I am thinking of a methodology for rapid retrieval of code snippets. I imagine an HTML table with a setup like this: one two ... ten one oneTwo() oneTen() two twoOne() twoTen() ... ten tenOne() tenTwo() When a user clicks a function in this HTML table, a snippet of code is shown in another div tag or perhaps a popup window (I'm open to different solutions). I want to maintain only one PHP file named utilities.php that contains a class called 'util'. This file & class will hold all the functions referenced in the above table (it is also used on various projects and is functional code). A key idea is that I do not want to update the HTML documentation every time I write/update a new function in utilities.php. I should be able to click a function in the table and have PHP open the utilities file, parse out the appropriate function and display it in an HTML window. Questions: 1) I will be coding this in PHP and JavaScript but am wondering if similar scripts are available (for all or part) so I don't reinvent the wheel. 2) Quick & easy Ajax suggestions appreciated too (I'll probably use jQuery, but am rusty). 3) Methodology for parsing out the functions from the utilities.php file (I'm not too good with regex).
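
    For question 3, a single regex is usually not enough because a function body can contain nested braces; a tiny brace-counting parser that starts at "function name(" and stops when the braces balance is more robust, and the approach ports directly to PHP's preg_match plus a loop. A sketch in Python, using the file and naming scheme from the question; it assumes the functions being extracted don't contain stray braces inside string literals:

      # Pull the source of one named function out of utilities.php by matching braces.
      import re

      def get_snippet(path, func_name):
          source = open(path).read()
          match = re.search(r"function\s+%s\s*\(" % re.escape(func_name), source)
          if not match:
              return None
          pos = source.index("{", match.end())
          depth = 0
          while pos < len(source):
              if source[pos] == "{":
                  depth += 1
              elif source[pos] == "}":
                  depth -= 1
                  if depth == 0:
                      break
              pos += 1
          return source[match.start():pos + 1]

      print(get_snippet("utilities.php", "oneTwo"))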

    Read the article

  • Extract all related class type aliases and enums into one file, or not?

    - by Chen OT
    I have many models in my project, and some other classes just need the class declaration and pointer type aliasing. It does not need to know the class definition, so I don't want to include the model header file. I extract all the model's declaration into one file to let every classes reference one file. model_forward.h class Cat; typedef std::shared_ptr<Cat> CatPointerType; typedef std::shared_ptr<const Cat> CatConstPointerType; class Dog; typedef std::shared_ptr<Dog> DogPointerType; typedef std::shared_ptr<const Dog> DogConstPointerType; class Fish; typedef std::shared_ptr<Fish> FishPointerType; typedef std::shared_ptr<const Fish> FishConstPointerType; enum CatType{RED_CAT, YELLOW_CAT, GREEN_CAT, PURPLE_CAT} enum DogType{HATE_CAT_DOG, HUSKY, GOLDEN_RETRIEVER} enum FishType{SHARK, OCTOPUS, SALMON} Is it acceptable practice? Should I make every unit, which needs a class declaration, depends on one file? Does it cause high coupling? Or I should put these pointer type aliasing and enum definition inside the class back? cat.h class Cat { typedef std::shared_ptr<Cat> PointerType; typedef std::shared_ptr<const Cat> ConstPointerType; enum Type{RED_CAT, YELLOW_CAT, GREEN_CAT, PURPLE_CAT} ... }; dog.h class Dog { typedef std::shared_ptr<Dog> PointerType; typedef std::shared_ptr<const Dog> ConstPointerType; enum Type{HATE_CAT_DOG, HUSKY, GOLDEN_RETRIEVER} ... } fish.h class Fish { ... }; Any suggestion will be helpful.

    Read the article

  • No access to Samba shares

    - by koanhead
    I have three shared folders in my local home directory- that is to say, on my Ubuntu desktop's /home/me/. All were set up using "Sharing Options" in Nautilus' right-click menu. The standard "Music" and "Videos" folders are configured identically: the "Guest Access" box is checked, but the "Allow others to create and delete" is not. The third folder, called "shared", is configured to not allow Guest access but to allow others to modify files. I have not altered /etc/samba/smb.conf by hand, I have only used Sharing Options to create and modify these so-called "shares". My roommates have two Windows 7 computers and one Ubuntu Netbook Remix netbook. I have the aforementioned desktop machine and laptop running 10.04. None of these machines can access any of the shares. Attempts to access the Guest shares result in the message \\machine\directory is not accessible. The network name could not be found. This is the error message generated by a VM running Windows 2000. The other Windows machines generate a similar error. The Ubuntu laptop gives the error Unable to mount location: Failed to mount Windows share. Hurrah, once again, for informative error messages. That really helps a lot. When attempting to browse the folder called "shared" from the laptop, I'm confronted with a password dialog. This behavior is the same will all machines I've tried in the situation. On entering my username and password for the account to which the shares belong, the password dialog briefly disappears and is replaced with an identical dialog. No error message, useful or not, appears. When attempting to browse this folder with the VM, the outcome is the same except that the password dialog helpfully states "incorrect username or password". My assumption is that the username and password in question is that of the user which owns the shares. I have tried all other username and password combinations available in this context and the outcome is the same. I would like to be able to share files. Sharing them with Windows machines is a nice feature, or would be if it was available. Really I consider sharing files between two machines with the same version of the same operating system kind of a minimum condition for network usability. Samba last functioned reliably for me more than ten years ago. I have attempted to use it on and off since then with only intermittent success. Oh, and "Personal File Sharing" from the Preferences menu does not result in an entry in Places → Network → my-server. In fact, the old entry "MY-SERVER" goes away and is replaced by "koanhead's public files on my-server", which when I attempt to open it from the laptop gives a "DBus.Error.NoReply: Message did not receive a reply." I know I come here and gripe about Ubuntu a lot, but on the other hand I spend literally hours every day trying to fix things in Ubuntu. It's a good system which aspires to greatness, which is why things like this either Need to work; or Be adequately documented. Ideally both would be the case. Anyway, rant over. Hopefully someone will have some insight on this issue. Thanks all who bother to read this wall o'text for your time.

    Read the article

  • Subband decomposition using Daubechies filter

    - by misha
    I have the following two 8-tap filters: h0 ['-0.010597', '0.032883', '0.030841', '-0.187035', '-0.027984', '0.630881', '0.714847', '0.230378'] h1 ['-0.230378', '0.714847', '-0.630881', '-0.027984', '0.187035', '0.030841', '-0.032883', '-0.010597'] Here they are on a graph: I'm using it to obtain the approximation (lower subband of an image). This is a(m,n) in the following diagram: I got the coefficients and diagram from the book Digital Image Processing, 3rd Edition, so I trust that they are correct. The star symbol denotes one dimensional convolution (either over rows or over columns). The down arrow denotes downsampling in one dimension (either over rows, or columns). My problem is that the filter coefficients for h0 and h1 sum to greater than 1 (approximately 1.4 or sqrt(2) to be exact). Naturally, if I convolve any image with the filter, the image will get brighter. Indeed, here's what I get (expected result on right): Can somebody suggest what the problem is here? Why should it work if the convolution filter coefficients sum to greater than 1? I have the source code, but it's quite long so I'm hoping to avoid posting it here. If it's absolutely necessary, I'll put it up later. EDIT What I'm doing is: Decompose into subbands Filter one of the subbands Recompose subbands into original image Note that the point isn't just to have a displayable subband-decomposed image -- I have to be able to perfectly reconstruct the original image from the subbands as well. So if I scale the filtered image in order to compensate for my decomposition filter making the image brighter, this is what I will have to do: Decompose into subbands Apply intensity scaling Filter one of the subbands Apply inverse intensity scaling Recompose subbands into original image Step 2 performs the scaling. This is what @Benjamin is suggesting. The problem is that then step 4 becomes necessary, or the original image will not be properly reconstructed. This longer method will work. However, the textbook explicitly says that no scaling is performed on the approximation subband. Of course, it's possible that the textbook is wrong. However, what's more possible is I'm misunderstanding something about the way this all works -- this is why I'm asking this question.
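
    One observation that may resolve the confusion: orthonormal Daubechies analysis filters are normalized to unit energy, not unit sum, so the lowpass branch has a DC gain of sqrt(2) per 1-D pass (roughly 2 after filtering rows and columns); that gain is cancelled only when the synthesis bank is applied during reconstruction, so a directly displayed approximation band is expected to look brighter even though the decomposition is correct and invertible. A quick check of the coefficients quoted above (plain Python):

      # The quoted 8-tap analysis lowpass: DC gain ~ sqrt(2), energy ~ 1.
      h0 = [-0.010597, 0.032883, 0.030841, -0.187035,
            -0.027984, 0.630881, 0.714847, 0.230378]

      dc_gain = sum(h0)                  # ~1.4142, i.e. sqrt(2), not 1
      energy = sum(c * c for c in h0)    # ~1.0, unit-energy (orthonormal) filter
      print(dc_gain, energy)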

    Read the article

  • Best practices for displaying large number of images as thumbnails in c#

    - by andySF
    I got to a point where it's very difficult to get answers by debugging and tracing objects, so I need some help. What I'm trying to do: A history form for my screen capture pet project. The history must list all images as thumbnails (e.g. Picasa). What I've done: I created a HistoryItem:UserControl. This history item has a few buttons, a check box, a label and a picture box. The buttons are for delete/edit/copy image. The check box is used for selecting one or more images and the label is for some info text. The picture box gets the image from a public property that is a path, and a method creates a proportional thumbnail to display when the control has been loaded. This user control has two public events. One for deleting the image and one for bubbling the events for mouse enter and mouse leave through all controls. For this I use EventBroadcastProvider. The bubbling is useful because wherever I move the mouse over the control, the buttons appear. The dispose method has been extended and I manually remove the events. All images are loaded by looping over an XML file that contains the path of every image. For each image in this XML I create a new HistoryItem that is added (after a little coding to sort and limit the amount of images loaded) to a flow layout panel. The problem: When I launch the history form, and the flow layout panel is populated with my HistoryItem custom controls, my memory usage increases drastically: from 14 MB to around 100 MB with 100 images loaded. By closing the history form and disposing whatever I could dispose and even trying to call GC.Collect(), the memory increase remains. I searched for any object that could not be disposed properly, like an image or event, but wherever I used them they are disposed. The problem seems to be from multiple sources. One is that the events for bubbling are not disposing properly, and the other is the picture box itself. All of this I could see by commenting out code down to a limited version where only the custom control, without any image processing and even events, is loaded. Without the events the memory consumption is reduced by approximately 20%. So my real question is whether this logic, flow layout panels and custom controls with picture boxes, is the best solution for displaying large amounts of images as thumbnails. Thank you!

    Read the article

  • "File not found" error while committing

    - by AntonAL
    I have a working copy, checked out from an SVN repository. When I try to commit, I get the following error: svn: File not found: revision 57, path '/trunk/path/to/my/file/logo-mini.jpg' I've found this file in the repo and noticed that it has only one revision - 58. I don't understand why SVN complains about this file when it is present, and why it points to revision 57 instead of 58? I've also renamed the grand-grand-grand-parent folder of this file. Possibly this is an issue ... Update: Detailed error description that I've got from the Cornerstone app (Mac OS X): Description : Could not find the specified file. Suggestion : Check that the path you have specified is correct. Technical Information ===================== Error : V4FileNotFoundError Exception : ZSVNNoSuchEntryException Causal Information ================== Description : Commit failed (details follow): Status : 160013 File : subversion/libsvn_client/commit.c, 867 Description : File not found: revision 57, path '/trunk/assets/themes/base/article-content/images/logo-mini.jpg' Status : 160013 File : subversion/libsvn_fs_fs/tree.c, 663 So, I've renamed the "/trunk/assets/themes" directory to "/trunk/assets/skins" while improving the project structure. I've tried the following: updating the /trunk/assets/themes directory, cleaning, deleting it from the filesystem and checking out again, reverting the entire /trunk/assets/themes directory to the HEAD revision. Even this doesn't help. Still getting the same error. I've got no results.

    Read the article

  • InnoDB: Error: log file ./ib_logfile0 is of different size

    - by jack
    I just added the following lines in /etc/mysql/my.cnf after I converted one database to use InnoDB engine. innodb_buffer_pool_size = 2560M innodb_log_file_size = 256M innodb_log_buffer_size = 8M innodb_flush_log_at_trx_commit = 2 innodb_thread_concurrency = 16 innodb_flush_method = O_DIRECT But it raise "ERROR 2013 (HY000) at line 2: Lost connection to MySQL server during query" error restarting mysqld. And mysql error log shows the following InnoDB: Error: log file ./ib_logfile0 is of different size 0 5242880 bytes InnoDB: than specified in the .cnf file 0 268435456 bytes! 100118 20:52:52 [ERROR] Plugin 'InnoDB' init function returned error. 100118 20:52:52 [ERROR] Plugin 'InnoDB' registration as a STORAGE ENGINE failed. 100118 20:52:52 [ERROR] Unknown/unsupported table type: InnoDB 100118 20:52:52 [ERROR] Aborting So I commented out this line # innodb_log_file_size = 256M And it restarted mysql successfully. I wonder what's the "5242880 bytes of log file" showed in mysql error? It's the first database on InnoDB engine on this server so when and where is that log file created? In this case, how can I enable innodb_log_file_size directive in my.cnf? EDIT I tried to delete /var/lib/mysql/ib_logfile0 and restart mysqld but it still failed. It now shows the following in error log. 100118 21:27:11 InnoDB: Log file ./ib_logfile0 did not exist: new to be created InnoDB: Setting log file ./ib_logfile0 size to 256 MB InnoDB: Database physically writes the file full: wait... InnoDB: Progress in MB: 100 200 InnoDB: Error: log file ./ib_logfile1 is of different size 0 5242880 bytes InnoDB: than specified in the .cnf file 0 268435456 bytes! Resolution It works now after deleted both ib_logfile0 and ib_logfile1 in /var/lib/mysql

    Read the article

  • BAT file will not run from Task Scheduler but will from Command Line

    - by wtaylor
    I'm trying to run a BAT script from Task Scheduler in Windows 2008 R2 and it runs for 3 seconds and then stops. It says it successfully completes but I know it doesn't. I can run this script from the command line directly, and it runs just fine. The bat file I'm running actually deletes files older than 7 days using "forfiles" then I'm mapping a network drive, moving the files across the network using robocopy, and then closing the network connection. I have taken the network and copy options out of the file and it still does the same thing. Here is how my file looks: rem This will delete the files from BBLEARN_stats forfiles -p "E:\BB_Maintenance_Data\DB_Backups\BBLEARN_stats" -m *.* -d -17 -c "cmd /c del @file" rem This will delete the files from BBLEARN_cms_doc forfiles -p "E:\BB_Maintenance_Data\DB_Backups\BBLEARN_cms_doc" -m *.* -d -14 -c "cmd /c del @path" rem This will delete the files from BBLEARN_admin forfiles -p "E:\BB_Maintenance_Data\DB_Backups\BBLEARN_admin" -m *.* -d -10 -c "cmd /c del @path" rem This will delete the files from BBLEARN_cms forfiles -p "E:\BB_Maintenance_Data\DB_Backups\BBLEARN_cms" -m *.* -d -10 -c "cmd /c del @path" rem This will delete the files from attendance_bb forfiles -p "E:\BB_Maintenance_Data\DB_Backups\attendance_bb" -m *.* -d -10 -c "cmd /c del @path" rem This will delete the files from BBLearn forfiles -p "E:\BB_Maintenance_Data\DB_Backups\BBLEARN" -m *.* -d -18 -c "cmd /c del @path" rem This will delete the files from Logs forfiles -p "E:\BB_Maintenance_Data\logs" -m *.* -d -10 -c "cmd /c del @path" NET USE Z: \\10.20.102.225\coursebackups\BB_DB_Backups /user:cie oly2008 ROBOCOPY E:\BB_Maintenance_Data Z: /e /XO /FFT /PURGE /NP /LOG:BB_DB_Backups.txt openfiles /disconnect /id * NET USE Z: /delete /y This is happening on 2 servers when trying to run commands from inside a BAT file. The other server is giving an error if (0xFFFFFFFF) but that file is running a CALL C:\dir\dir\file.bat -options and I've used commands like that before in Server 2003. Here is the file for this file: call C:\blackboard\apps\content-exchange\bin\batch_ImportExport.bat -f backup_batch_file.txt -l 1 -t archive NET USE Z: \\10.20.102.225\coursebackups\BB_Course_Backups /user:cie oly2008 ROBOCOPY E:\ Z: /move /e /LOG+:BB_Move_Course_Backups.txt openfiles /disconnect /id * NET USE Z: /delete /y Any help would be GREAT. Thanks

    Read the article

  • vhost.conf file in PLESK not working as intended

    - by Saif Bechan
    I have configured a vhost file for my domain but it does not seem to work. These are the steps I took, please correct me if I am wrong. First I made a file called vhost.conf in: /var/www/vhosts/*domain*/conf/vhost.conf The content of the vhost file looks like this: <Directory /var/www/vhosts/*domain*/httpdocs> php_admin_flag engine on php_admin_flag display_errors on </Directory> Now in my /etc/php.ini i set display_errors=Off After everything i rebuild with: /usr/local/psa/admin/sbin/websrvmng -a But I don't see the any errors in my page. When i turn on the display_errors in /etc/php.ini only then can I see the errors. I know for a fact that the vhost file is read, because when i type nonsense values i get an error when restarting apache saying there are errors in the vhost file. Anyone know what the problem can be. Should there be special settings in either the php.ini file or the httpd.conf file. The httpd.conf i edit is in /etc/httpd/conf/httpd.conf. Is this the file that PLESK uses or is there another, because the values i see there do not really reflect the http folders of my domain. The httpd file looks like this now. # The document root DocumentRoot "/var/www/html" # i guess this is the base directory <Directory /> Order Deny,Allow Deny from all Options None AllowOverride None </Directory> # And i guess here are all my domains located, but there aren't any here <Directory "/var/www/html"> Options None AllowOverride None Order allow,deny Allow from all </Directory> Only this directory /var/www/html is not used by me, I use the directory /var/www/vhosts. The only folder found in /var/www/html is a folder called awstats. Does plesk use other files, and where are they located. I hope this all makes sense to anyone, and i hope i can find a solution

    Read the article

  • yum error when installing memcached

    - by Jack
    Hi, trying to install memcached with "yum install memcached" and i'm getting all these errors which I have no idea how to solve. Setting up Install Process Resolving Dependencies -- Running transaction check --- Package memcached.x86_64 0:1.4.5-1.el5.rf set to be updated -- Processing Dependency: perl(AnyEvent) for package: memcached -- Processing Dependency: perl(AnyEvent::Socket) for package: memcached -- Processing Dependency: perl(AnyEvent::Handle) for package: memcached -- Processing Dependency: perl(YAML) for package: memcached -- Processing Dependency: perl(Term::ReadKey) for package: memcached -- Processing Dependency: libevent-1.1a.so.1()(64bit) for package: memcached -- Running transaction check --- Package compat-libevent-11a.x86_64 0:3.2.1-1.el5.rf set to be updated --- Package memcached.x86_64 0:1.4.5-1.el5.rf set to be updated -- Processing Dependency: perl(AnyEvent) for package: memcached -- Processing Dependency: perl(AnyEvent::Socket) for package: memcached -- Processing Dependency: perl(AnyEvent::Handle) for package: memcached -- Processing Dependency: perl(YAML) for package: memcached -- Processing Dependency: perl(Term::ReadKey) for package: memcached -- Finished Dependency Resolution memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems -- Missing Dependency: perl(AnyEvent::Socket) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge) memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems -- Missing Dependency: perl(AnyEvent) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge) memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems -- Missing Dependency: perl(AnyEvent::Handle) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge) memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems -- Missing Dependency: perl(YAML) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge) memcached-1.4.5-1.el5.rf.x86_64 from rpmforge has depsolving problems -- Missing Dependency: perl(Term::ReadKey) is needed by package memcached-1.4.5-1.el5.rf.x86_64 (rpmforge) Packages skipped because of dependency problems: compat-libevent-11a-3.2.1-1.el5.rf.x86_64 from rpmforge memcached-1.4.5-1.el5.rf.x86_64 from rpmforge The perl modules that its complaining about are already installed. Any ideas?

    Read the article
