Search Results


  • Small change in MVVM Light Toolkit templates for Blend 4 RC

    - by Laurent Bugnion
    Ah, the joy of new releases… You will find that the MVVM Light Toolkit works fine with Visual Studio 2010 RTM and Blend 4 RC, except for a few adjustments:

    Blend templates: The path to the Expression Blend 4 project templates changed. If you start Expression Blend 4 RC now, you will likely not see the MVVM Light templates in the New Project dialog. (Figure: New Project dialog with MVVM Light.) To restore the templates, follow these steps:

        1. Open Windows Explorer.
        2. Navigate to C:\Users\[username]\Documents\Expression (or simply type My Documents in Windows Explorer and then open the Expression folder).
        3. Rename the “Blend 4 beta” folder to “Blend 4”.

    That’s it; you should now see the templates in the New Project dialog in Blend 4. Since the new name is “Blend 4”, I hope that I won’t need to repeat this exercise when Blend 4 RTM is released!

    Windows Phone 7 templates: Since the Windows Phone 7 tools are not ready yet for Visual Studio 2010 RTM and Blend 4 RC, the templates in the Silverlight for Windows Phone folders will not work. You will get an error if you try to create such a project in the newly released environment. I hesitated to remove these templates from the current packages, but honestly that is a lot of trouble for a very short time before the tools for Windows Phone 7 are released (note: I don’t have any information as to when these tools will be released). In the meantime, just don’t create a WinPhone7 application. Reminder: if you want to write code for Windows Phone 7, you need to keep Visual Studio 2010 RC as well as Expression Blend 4 beta.

    Updated package: I uploaded an update to the Blend 4 templates. It is available as before on the “Install manually” page and on the CodePlex page.

    Laurent Bugnion (GalaSoft). Subscribe | Twitter | Facebook | Flickr | LinkedIn
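    The rename in step 3 can also be done in one line from a Command Prompt; the path below assumes the default Documents location:

        ren "%USERPROFILE%\Documents\Expression\Blend 4 beta" "Blend 4"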

    Read the article

  • Ubuntu 11.04 and OpenLDAP - where is the config?

    - by Tom SKelley
    I've been asked to set up a multi-master LDAP environment on Ubuntu 11.04, instead of a single master server. I cloned the master server and recreated it into two VMs. I am trying to follow the instructions in the OpenLDAP documentation here: http://www.openldap.org/doc/admin24/replication.html, which talks about modifying the cn=config tree within LDAP. The subdirectory tree appears to be there at /etc/ldap/slapd.d/, and a slapcat -b cn=config dumps a load of config information. But when I try to connect using a browser and the admin bind credentials:

        ldapsearch -D '<adminDN>' -w <password> -b 'cn=config'

    I get:

        # extended LDIF
        #
        # LDAPv3
        # base <> (default) with scope subtree
        # filter: (objectclass=*)
        # requesting: ALL
        #

        # search result
        search: 2
        result: 32 No such object

    I don't see the config context when I connect via an LDAP browser either. I'm sure I'm missing something, but I can't see what it is!
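    A likely explanation, assuming the stock Ubuntu slapd packaging: the cn=config database has its own access rules and is usually not readable by the data tree's admin DN. On Ubuntu/Debian it is typically granted to the local root user via SASL EXTERNAL over the ldapi socket, so something like this should show the tree:

        sudo ldapsearch -Y EXTERNAL -H ldapi:/// -b cn=config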

    Read the article

  • Installing Bugzilla on Ubuntu 9.04 and Plesk

    - by makeflo
    Hey guys, I'm trying to install the latest Bugzilla version on my Ubuntu server (I want to use a subdomain like bugs.domain.com). I already installed all the necessary Perl modules, and check_modules.pl doesn't show any errors. But when I run the testserver.pl script I get the following:

        TEST-OK Webserver is running under group id in $webservergroup
        TEST-FAILED Fetch of images/padlock.png failed

    I'm also not able to visit ANY file within the bugzilla folder from the browser; I always get a 404 error. The bugzilla folder and all contained files have apache set as the owner. I tried adding the Apache configuration from the installation guide to the httpd.include file of the domain and to the vhosts.conf file of the subdomain as well. I don't know what to do... playing with Plesk's suexecgroup doesn't bring any solution. I hope you can help me! Thanks in advance!
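    For reference, the Apache stanza from the Bugzilla installation guide looks roughly like the sketch below; the path is a placeholder for wherever Plesk serves the subdomain from, and a 404 on every file usually means this <Directory> block is not being picked up at all:

        <Directory /path/to/bugzilla>
            AddHandler cgi-script .cgi
            Options +ExecCGI +FollowSymLinks
            DirectoryIndex index.cgi index.html
            AllowOverride Limit FileInfo Indexes
        </Directory>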

    Read the article

  • Ad-hoc String Manipulation With Visual Studio

    - by Liam McLennan
    Visual Studio supports relatively advanced string manipulation via the ‘Quick Replace’ dialog. Today I had a requirement to modify some HTML, replacing line breaks with unordered list items. For example, I needed to convert:

        Infrastructure<br/>
        Energy<br/>
        Industrial development<br/>
        Urban growth<br/>
        Water<br/>
        Food security<br/>

    to:

        <li>Infrastructure</li>
        <li>Energy</li>
        <li>Industrial development</li>
        <li>Urban growth</li>
        <li>Water</li>
        <li>Food security</li>

    This cannot be done with a simple search-and-replace, but it can be done using the Quick Replace regular expression support. To use regular expressions, expand ‘Find Options’, check ‘Use:’ and select ‘Regular Expressions’. Typically, Visual Studio regular expressions use a different syntax to every other regular expression engine. We need to use a capturing group to grab the text of each line so that it can be included in the replacement. The syntax for a capturing group is to wrap the part of the expression to be captured in { and }. So my regular expression:

        {.*}\<br/\>

    means capture all the characters before <br/>. Note that < and > have to be escaped with \. In the replacement expression we can use \1 to insert the previously captured text:

        <li>\1</li>

    If the search expression had a second capturing group, its text would be available in \2 and so on. Visual Studio’s quick replace feature can be scoped to a selection, the current document, all open documents or every document in the current solution.
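    A note for later versions: from Visual Studio 2012 onward, Find and Replace uses standard .NET regular expression syntax, so the equivalent find pattern would be (.*)<br/> with the replacement <li>$1</li>.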

    Read the article

  • Outlook 2010 - Export of an Exchange OST to PST creates files with different sizes each time

    - by Jiri Pik
    This is a most weird issue. I have a couple of Exchange OST mailboxes, and just for safety I am exporting them using File / Import / Export to a file / Export to PST file. If I run the export repeatedly, it creates files with different sizes each time, WITH NO ERROR OR WARNING that something went wrong. The files should be the same size when one export runs right after the previous one finished. I found that if the file size is substantially lower, a reboot and a fresh export can fix it. What's your insight into this problem? What could cause the files to have different sizes, and what could have caused there to be no warning? I suspected some Windows Search issue, as the export sometimes fails with a dialog error stating that Windows Search terminated it.

    Read the article

  • How do I get write access to ubuntu files from Windows?

    - by Steven
    I'm running Ubuntu 11.10 on my virtual machine as a web server. I've mounted the W:/ drive in Win 7 to my /www folder in Ubuntu. I can read the files, but I'm not able to write to them. In Samba, I have created the following user:

        <www-data> = "<www-data>"

    and given guest access for the www folder:

        [www]
        comment = Ubuntu WWW area
        path = /var/www
        browsable = yes
        guest ok = yes
        read only = no
        create mask = 0755
        ;directory mask = 0775
        force user = www-data
        force group = www-data

    I've also run sudo chmod -R 755 www to ensure correct rw access. What am I missing in order to get write access to my Ubuntu files from Windows?
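    A likely culprit, assuming the files were copied in as another user: with force user = www-data, Samba performs all writes as www-data, so files owned by someone else with mode 755 are read-only to it. A sketch of the permission fix:

        # hand the tree to www-data and allow group writes
        sudo chown -R www-data:www-data /var/www
        sudo chmod -R 775 /var/www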

    Read the article

  • Dynamic procmail filters

    - by WombaT
    I need procmail to place incoming mail into a specific folder depending on a set of rules. I know how to accomplish this with a static set of rules in a specific file, but what I really need is to configure procmail to use rules stored in a MySQL database. How can I do this? I've read a bit about it, and one solution I found is to pipe the message to a php/perl script and have it return a folder name for the message. But I have no idea how to use a PHP script as a rule and then use its return value.
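    One common pattern, sketched under the assumption of a hypothetical classify.php that reads the message on stdin, looks it up in MySQL, and prints a folder name: procmail feeds the message to the standard input of a command used in a backtick assignment, so the script's output can become the delivery target.

        # .procmailrc sketch
        DEST=`/usr/local/bin/classify.php`

        :0
        $DEST/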

    Read the article

  • double-click does not open the default program

    - by Chang
    I installed Ubuntu 12.04, with texlive-full and texworks. When I double-click a .tex file in Nautilus, it pops up:

        Do you want to run "xxxxxxxx.tex", or display its contents?
        "xxxxxxxx.tex" is an executable text file.
        [Run in Terminal] [Display] [Cancel] [Run]

    If I choose Display, it opens texworks. How can I make it open without seeing the above dialog? By the way, is a .tex file really an executable file?

    ADDED: Just in case, my ~/.local/share/applications/mimeapps.list file looks like the following:

        [Default Applications]
        text/html=google-chrome.desktop
        x-scheme-handler/http=google-chrome.desktop
        x-scheme-handler/https=google-chrome.desktop
        x-scheme-handler/about=google-chrome.desktop
        x-scheme-handler/unknown=google-chrome.desktop
        text/x-tex=texworks.desktop
        x-scheme-handler/mailto=google-chrome.desktop

        [Added Associations]
        text/x-tex=texworks.desktop;
        text/x-bibtex=jabref.desktop;gedit.desktop;

    I see texworks listed in both sections. Should I remove one?

    ADDED: This does not happen with all .tex files. In fact, I am currently running Ubuntu 12.04 under VirtualBox with a Windows 7 host. I have my Dropbox account synced with the Windows 7 host, and I access files in Dropbox from Ubuntu through VirtualBox's shared folder functionality (I didn't install the Dropbox client in Ubuntu). Files in Dropbox are owned by root with group vboxsf, and my personal account is in the vboxsf group. It seems that I would have to clear the "executable" bit, but all my .tex files are in the Dropbox shared folder. Is there any workaround?
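    A workaround that sidesteps the executable bit (which cannot be changed per-file on a vboxsf mount): tell Nautilus to always display executable text files instead of asking, either via Edit > Preferences > Behavior, or from the command line:

        gsettings set org.gnome.nautilus.preferences executable-text-activation 'display'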

    Read the article

  • User account shows two Downloads folders

    - by Chris Lieb
    I have my user account on my D drive, junction'd to the C:\users folder. I accidentally moved my profile Downloads folder (C:\users\me\Downloads) and then moved it back to its path on the D drive (D:\me\Downloads). After doing this, the directory tree for my user profile lists two Downloads directories, one located at C:\users\me and one at D:\me. I tried deleting the directory from the D drive, then restoring it from the Recycle Bin to the proper location on the C drive (actually the D drive, accessed through the junction), but that gave me the two Downloads directories again. Is there some way to fix this so that the only listing is for C:\users\me\Downloads, as it was to begin with?
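    One thing worth checking, on the assumption that Windows tracked the accidental move: Explorer stores the per-user path of the Downloads known folder in the registry under its GUID, {374DE290-123F-4565-9164-39C4925E467B}, and a stale entry there can make the folder appear twice:

        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v "{374DE290-123F-4565-9164-39C4925E467B}"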

    Read the article

  • App installed in ~/usr launches from terminal but not Applications menu (or why does setting ld_library_path in .profile not work as it should)

    - by levesque
    I have built and installed an application under a directory of my choosing, let's say under /home/jim/usr, so files have been put in three or four folders, all under this $HOME/usr folder (e.g., bin, include, lib, share). I can launch this application from the command line just fine, as I added the proper paths to my environment variables PATH and LD_LIBRARY_PATH in ~/.bashrc. I added the same paths to the ~/.profile file, which, if I'm not mistaken, is supposed to be parsed by Ubuntu. Doesn't work. Nothing. Where can I go from there?

    EDIT: I logged out/in and restarted my computer; neither changed a thing. The problem seems to come from the fact that no matter what I do, the LD_LIBRARY_PATH environment variable is not properly passed to Ubuntu. Using log files, I found that the application I'm trying to run in this example doesn't find one of its dependencies located in ~/usr/lib. One solution would be to add the /home/jim/usr/lib folder to a file under /etc/ld.so.conf.d/, but I don't have admin rights on this machine. Making a wrapper script like this one works:

        #!/bin/bash
        export LD_LIBRARY_PATH=$HLOC/usr/lib
        application &> $HOME/application_messages.log

    but that would force me to wrap all my home-compiled applications with this script. Any ideas? Why does Ubuntu/GNOME remove the LD_LIBRARY_PATH environment variable from my set variables? Is it because trying to do this is bad practice?

    UPDATE (and solution): As found by Christopher, there is a bug report about this on Launchpad: LD_LIBRARY_PATH is unset after parsing of the ~/.profile file. See the bug report. It seems the only solution for now is to use a wrapper script.
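    An alternative to wrapping every binary, assuming the applications are launched from menu entries: set the variable in the launcher itself, so only the .desktop file changes rather than the program. A sketch of the Exec line:

        Exec=env LD_LIBRARY_PATH=/home/jim/usr/lib /home/jim/usr/bin/application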

    Read the article

  • Rsync plugin to many local wordpress installs via script or cli

    - by Nick Abbey
    I am maintaining a large number of WordPress installs on a production server, and we are looking to deploy InfiniteWP for managing these installs. I am looking for a way to script the distribution of the plugin folder to all of these installs. On server wp-prod, all sites are stored in /srv//site/. The plugin needs to be copied from ~/iws-plugin to /srv//site/wp-content/plugins/. Here's some pseudo code to explain what I need to do:

        array dirs = <all folders in /srv>
        for each d in dirs
          if exists "/srv/d/site/wp-content/plugins"
            rsync -avzh --log-file=~/d.log ~/plugin_base_folder /srv/d/site/wp-content/plugins/
          else
            touch d.log
            echo 'plugin folder for "d" not found' >> ~/d.log
          end
        end

    I just don't know how to make it happen from the cli or via bash. I can (and will) tinker with a bash or ruby script on my test server, but I'm thinking the command-line-fu here on SF is strong enough to handle this issue much more quickly than I can hack together a solution. Thanks!
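    A bash rendering of the pseudocode above, assuming the layout is /srv/<sitename>/site/ and the plugin lives in ~/iws-plugin:

        #!/bin/bash
        for d in /srv/*/; do
            name=$(basename "$d")
            plugins="${d}site/wp-content/plugins"
            if [ -d "$plugins" ]; then
                rsync -avzh --log-file="$HOME/$name.log" "$HOME/iws-plugin" "$plugins/"
            else
                echo "plugin folder for \"$name\" not found" >> "$HOME/$name.log"
            fi
        done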

    Read the article

  • Identifying a stolen computer: getting hardware identification info from Launchpad bugs and comparing

    - by Kangarooo
    I sold my old laptop to neighbours and it was stolen from them. I think I have found the thief, so I want to check his computer's hardware IDs and compare them to those recorded in my old Launchpad bugs. How, from my bugs in Launchpad, can I find:

    - the motherboard
    - the HDD
    - anything else that can help identify it

    Also, maybe, how to recover or find some overwritten files (because there is Windows on it now). I found that one of my Launchpad bugs has an auto-generated lspci attachment, from bug 682846: https://launchpadlibrarian.net/70611231/Lspci.txt, but I don't see any ID there that identifies my computer specifically; it could match many machines of the same model. Or did I miss something in there? And what commands should I use to get all the identifying info on that computer in one go? Just lspci? How do I get the same lspci output as in that Launchpad link? Testing lspci on my computer now, I don't get as much info. I'm also searching my external HDD, where I have many backups, in case I saved lspci output there; what keywords would help me find the short and full reports I may have generated? I might have run sudo lshw > somefilename.
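    For the "in one go" part, a few standard commands that dump unit-specific serial numbers (run as root); these are the usual sources of per-machine identifiers, as opposed to lspci, which mostly lists model-level IDs:

        sudo dmidecode -s system-serial-number      # system/motherboard serial
        sudo dmidecode -s baseboard-serial-number
        sudo hdparm -I /dev/sda | grep -i serial    # disk serial
        sudo lshw > lshw-report.txt                 # full hardware inventory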

    Read the article

  • How can I rebuild the index in Thunderbird 3.0?

    - by Martin
    I have just deleted a lot of old messages (about 3,000) from my Thunderbird 3.0 profile. When I now use the new search feature (search all messages), TB still finds the deleted ones. I deleted them this way: I moved the messages to my own "archive" folder (not the built-in archive feature), then I stopped TB and moved the archive files and folders to a different place on my file system, then restarted TB. I have archived my messages this way for years. So it seems that Thunderbird does not notice the deletion of the messages, and the index is not updated. How can I tell TB to rebuild the index right away?
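    For what it's worth, Thunderbird 3's global search index (Gloda) lives in a single file, global-messages-db.sqlite, in the profile folder; deleting it while Thunderbird is closed forces a full re-index on the next start. On Linux the profile typically sits under ~/.thunderbird/:

        rm ~/.thunderbird/*.default/global-messages-db.sqlite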

    Read the article

  • Backup Dropbox to Amazon Glacier

    - by joekr
    I'm using Dropbox for backup, which means I keep all my files in my Dropbox folder (encrypted using encfs, but that should not be relevant). I like this solution because it is automatic and keeps copies of my files on several machines at different locations. The only thing I could see go wrong is that Dropbox has some sort of bug that tells all my machines to delete the files. So currently I do a backup of the Dropbox folder to an external hard drive. With Amazon Glacier it seems affordable to automate backup snapshots of my Dropbox. What I am looking for is a tool that will do this for me; the base-case scenario would be that files go from Dropbox (using their API) directly to Amazon, as uploading the ~80GB from my home connection would take forever... Thanks!
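    Absent a Dropbox-to-Glacier tool, a minimal cron-able sketch from any machine that already has the folder synced, using the AWS CLI (the vault name is a placeholder, and this still uploads from that machine's connection, so it does not solve the bandwidth concern):

        #!/bin/bash
        SNAP=/tmp/dropbox-$(date +%F).tar.gz
        tar czf "$SNAP" "$HOME/Dropbox"
        aws glacier upload-archive --account-id - --vault-name dropbox-snapshots \
            --archive-description "Dropbox snapshot $(date +%F)" --body "$SNAP"
        rm "$SNAP"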

    Read the article

  • How to prevent duplication of content on a page with too many filters?

    - by Vikas Gulati
    I have a webpage where a user can search for items based on around 6 filters. Currently the page is implemented with one filter as the base filter (part of the URL that gets indexed) and the other filters in the form of hash URLs (which won't get indexed), so the duplication is lower. Something like this:

        example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value

    As you can see, only one filter is within reach of the search engine while the rest are hashed; this way I could have 6 pages. Now the problem is that I expect users to use two filters at times while searching as well. As per my analysis using the Google Keyword Analyzer, a fair number of users might use two filters in conjunction while searching. So how should I go about it? Having all the filters as part of the URL would simply explode the number of pages, and sticking to the current way wouldn't let me target those users. I was thinking of going with at most 2 base filters and the rest as part of the hash URL. But the only thing stopping me is that it could lead to duplication of content, as per Google Webmaster Tools' suggestions on URL structure.
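    One common way to allow two-filter pages without a duplication penalty, assuming the same result set is reachable under several filter orders: pick one canonical ordering of the two base filters and declare it on every variant, e.g.

        <link rel="canonical" href="http://example.com/filter1value-filter2value-items" />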

    Read the article

  • Change a Foreign Action's Display Text

    - by Geertjan
    I want the display text of an Action on a Node to show something about the underlying object. But the Action is registered somewhere in the layer (i.e., in the registry), so I have no control over it. How do I change the display text in this scenario? Here's how. Below I look in the Actions/Events folder, iterate through all the Actions registered there, look for an Action with display text starting with "Edit", change it to display something from the underlying object, wrap a new Action around that Action, build up a new list of Actions, and return those (together with all the other Actions in that folder) from "getActions" on my Node:

        @Override
        public Action[] getActions(boolean context) {
            List<Action> newEventActions = new ArrayList<Action>();
            List<? extends Action> eventActions = Utilities.actionsForPath("Actions/Events");
            for (final Action action : eventActions) {
                String value = action.getValue(Action.NAME).toString();
                if (value.startsWith("Edit")) {
                    // Wrap the registered Action, replacing its display text
                    // with something from the underlying Event object:
                    Action editAction = new AbstractAction("Edit " + getLookup().lookup(Event.class).getPlace()) {
                        @Override
                        public void actionPerformed(ActionEvent e) {
                            action.actionPerformed(e);
                        }
                    };
                    newEventActions.add(editAction);
                } else {
                    newEventActions.add(action);
                }
            }
            return newEventActions.toArray(new Action[newEventActions.size()]);
        }

    If someone knows of a better way, please let me know.

    Read the article

  • esxi 5.1 - copy paste host config doesn't work?

    - by w--
    I followed the instructions in this VMware KB article for ESXi 5.1 to enable copy/paste for all VMs from the host config (search for "To enable this option for all the virtual machines in the ESX/ESXi host:"): http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1026437. But copy and paste doesn't work, for new VMs or existing VMs. Did I do something wrong somewhere? I've rebooted the VMs and even rebooted the host. I can confirm that if I do a per-VM configuration, it works; just not the central host config. Here is a screenshot of my /etc/vmware/config file.
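    For reference, the per-host settings that KB article has you add to /etc/vmware/config are:

        isolation.tools.copy.disable="FALSE"
        isolation.tools.paste.disable="FALSE"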

    Read the article

  • yum not able to install a package

    - by shadyabhi
        [root@mypc yum.repos.d]# yum search perl-Locale-gettext
        Loaded plugins: dellsysid, fastestmirror
        Repository tmz-puppet is listed more than once in the configuration
        Loading mirror speeds from cached hostfile
         * atomic: www6.atomicorp.com
         * base: mirror.trouble-free.net
         * epel: mirrors.tummy.com
         * extras: eq-centosrepo.hopto.org
         * rpmforge: mirror.hmc.edu
         * updates: mirror.team-cymru.org
        ====================== N/S Matched: perl-Locale-gettext ======================
        perl-Locale-gettext.x86_64 : Internationalization for Perl

          Name and summary matches only, use "search all" for everything.

    And:

        [root@mypc yum.repos.d]# yum install perl-Locale-gettext
        Loaded plugins: dellsysid, fastestmirror
        Repository tmz-puppet is listed more than once in the configuration
        Loading mirror speeds from cached hostfile
         * atomic: mir01.syntis.net
         * base: mirrors.gigenet.com
         * epel: mirror.us.leaseweb.net
         * extras: centos.mirror.lstn.net
         * rpmforge: mirror.hmc.edu
         * updates: centos.mirror.choopa.net
        Setting up Install Process
        Nothing to do

    What is going wrong here?
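    "Nothing to do" with no "already installed" message often means the candidate package was filtered out after resolution. A few diagnostics that should narrow it down:

        yum list perl-Locale-gettext              # listed as installed or available?
        rpm -q perl-Locale-gettext                # is the rpm already present?
        yum install perl-Locale-gettext.x86_64    # try the fully qualified name
        grep -r exclude /etc/yum.conf /etc/yum.repos.d/   # check for exclude= filters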

    Read the article

  • USB mass storage couldn't get mounted

    - by revo
    It's my Android phone's SD card, which was reported damaged by Android yesterday night, out of the blue! I put it directly into a USB port with a USB SD-card holder case, so that I could recover it with TestDisk, which I had used before in a similar situation. I also noticed a change in file system and capacity:

        File System : RAW
        Capacity : 0 (unknown capacity)

    TestDisk doesn't show it in its partitions list either. A 2 GB SD card is not that expensive, but I have a lot of files and media on it which I need. Using a mini card reader, TestDisk displayed it in its list, but a quick search and a deep search found nothing ("No partition found or selected for recovery"), and then I had to quit the program. Your help is appreciated.

    Update #2, lsusb output:

        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 004 Device 002: ID 04f3:0234 Elan Microelectronics Corp.
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 001 Device 002: ID 058f:6366 Alcor Micro Corp. Multi Flash Reader
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 009 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
        Bus 008 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 007 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
        Bus 006 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
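    Before further recovery attempts, it is usually worth imaging the card once and working on the copy, so that failed reads don't degrade it further. A sketch with GNU ddrescue (package gddrescue on Ubuntu), assuming the card appears as /dev/sdX:

        sudo ddrescue -d /dev/sdX sdcard.img sdcard.map
        # then point TestDisk/PhotoRec at the image instead of the card:
        testdisk sdcard.img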

    Read the article

  • Legal issues regarding embedding a toolbar into a browser [closed]

    - by OmarOthman
    We are in the process of developing software that provides a service to internet users, and we would like to ask about the legal liabilities of some issues. Of course, everything is to be done with the consent of the user of our software, but our concern is about third-party tools and services that may be invoked/used by our product. In particular, these are the concerns:

    (1) Embedding a toolbar into an existing browser. This screenshot is an example, where the words in the highlighted toolbar are passed to www.google.com for searching, and the contents of the window are the results of the search. I want to know if any consent should be obtained before such a toolbar can be embedded in a web browser; whether there are any legal requirements by the web browser; and whether different web browsers have different requirements (at least for Internet Explorer, Firefox, Chrome, Opera and Safari).

    (2) Invoking a free website from that toolbar (like Google's search page). The screenshot above demonstrates such an existing toolbar.

    (3) Full ownership of and unrestricted access to the data entered into this toolbar. In the screenshot above, I want to take the words (translation english to spanish) and own them, i.e. store them in my database and do some processing on them.

    (4) The ability to track the pages visited by the user starting from that free website. In the screenshot above, you can notice that the user opted only for the third result, whose URL is translate.google.com. I want to have access to this and all URLs clicked from this page, for some processing as well.

    This is a commercial application, so I need a very concrete, precise and reference-supported answer.

    Read the article

  • How can I make gitosis distinguish between two users with the same username

    - by bryan kennedy
    I have a gitosis system that seems to be working correctly, except for a common problem we run into: I can't distinguish permissions between two users who have the same username but different hosts. For example: [email protected]'s SSH key is in the key folder, and so is [email protected]'s SSH key. These two jsmiths are two different people on two different computers. However, when I configure them in the gitosis.conf file with the usernames jsmith@computer or jsmith@machine, it seems like each user just gets the same permissions. Can gitosis not distinguish the full username (name and host)? If not, how do I deal with multiple users accessing our system with common usernames? Thanks for any help.
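    For what it's worth, gitosis identifies a user by the filename of the public key in keydir/ (minus the .pub extension), not by the user@host comment inside the key itself. So naming the key files distinctly should make the two people distinct members; a sketch, with hypothetical key names:

        keydir/jsmith@laptop.pub
        keydir/jsmith@desktop.pub

        [group team-a]
        members = jsmith@laptop
        writable = project-a

        [group team-b]
        members = jsmith@desktop
        writable = project-b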

    Read the article

  • Outlook macro runs through 250 iterations before failing with error [migrated]

    - by Senoculus
    Description: I have an Outlook macro that loops through selected emails in a folder and writes some info to a .csv file. It works perfectly up until email 250, then fails. Here is some of the code:

        Open strSaveAsFilename For Append As #1
        CountVar = 0
        For Each objItem In Application.ActiveExplorer.Selection
            DoEvents
            If objItem.VotingResponse <> "" Then
                CountVar = CountVar + 1
                Debug.Print " " & CountVar & ". " & objItem.SenderName
                Print #1, objItem.SenderName & "," & objItem.VotingResponse
            Else
                CountVar = CountVar + 1
                Debug.Print " " & CountVar & ". " & "Moving email from: " & Chr(34) & objItem.SenderName & Chr(34) & " to: Special Cases sub-folder"
                objItem.Move CurrentFolderVar.Folders("Special Cases")
            End If
        Next
        Close #1

    Problem: After this code runs through 250 emails, the following screenshot pops up: http://i.stack.imgur.com/yt9P8.jpg. I've tried adding a "wait" function to give the server a rest so that I'm not querying it so quickly, but I get the same error at the same point.
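    The number 250 is suggestive: Exchange caps how many objects a single MAPI session may hold open, and the default limit for open messages is 250, which would explain a failure at exactly that count regardless of speed. A hedged workaround sketch is to release each item once processed, indexing the selection instead of using For Each:

        Dim i As Long
        For i = 1 To Application.ActiveExplorer.Selection.Count
            Set objItem = Application.ActiveExplorer.Selection.Item(i)
            ' ... process objItem as in the loop above ...
            Set objItem = Nothing   ' release the MAPI object before opening the next
        Next i

    If that is not enough, the server-side limit itself (MaxObjsPerMapiSession) can be raised by an Exchange administrator.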

    Read the article

  • Shared mailbox - users cannot create or view subfolders

    - by carlpett
    I've set up a shared mailbox on our Exchange 2010/SBS 2011 server. I've added some users with Full Access permission on this mailbox, and when they open Outlook or log in to OWA the mailbox is opened automatically. Great stuff. However, only the Inbox folder is visible, and the option to create a folder is grayed out. If they open the mailbox explicitly (for instance in OWA by clicking "open other user's mailbox") they can see other folders, as well as create new ones. What configuration is needed to be able to view and create subfolders directly?
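    A guess at the cause, based on the symptom that the explicit open works: the automatically mapped view may have been granted at the folder level (Inbox only) rather than as mailbox-wide Full Access. Re-granting Full Access from the Exchange Management Shell, with automapping, is a reasonable first check; the identities below are placeholders:

        Add-MailboxPermission -Identity "SharedMailbox" -User "jsmith" `
            -AccessRights FullAccess -AutoMapping $true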

    Read the article

  • Set default owner/user

    - by Daniel Hollands
    I'm a web developer, and I've set up an old machine in the office as an Ubuntu server for the purpose of testing websites. I've set up LAMP and have created a /var/www folder, from which all my local sites are served. The issue is one of user permissions: any files that I copy into that folder (from my Windows machine via the network) automatically take on me (daniel) as their owner. The problem is that I want www-data to become the owner. I did some research and saw that it should be possible to use setuid (and setgid) to automatically set www-data as the owner of all files put into /var/www, but so far I've had no luck making it work. Can someone help please? Thank you.

    UPDATE: Would this do what I want it to do? Default file permissions for php user www-data

    UPDATE 2: I've kind of fixed my issue by changing my Samba settings. Using Webmin, I was able to go in and change the default settings (as seen here: http://imageshack.us/photo/my-images/521/captureon.png/).
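    A note on the mechanism, plus a sketch: the setgid bit on a directory makes new files inherit the directory's group, but nothing in POSIX inherits the owner; owner inheritance has to come from whatever writes the file (e.g. Samba's force user, which is essentially the eventual fix here). Group inheritance plus a default ACL often achieves the same effect:

        sudo chgrp -R www-data /var/www
        sudo chmod g+s /var/www                        # new files inherit group www-data
        sudo setfacl -R -d -m g:www-data:rwx /var/www  # group gets rw on new files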

    Read the article
