Search Results

Search found 89638 results on 3586 pages for 'file table'.


  • How to import more than one SSIS package into BIDS in one shot!

    - by Luca Zavarella
    Have you ever wanted to add more than one existing Integration Services package (e.g. 20 packages) to an SSIS project? Well, you might suppose that an Open dialog supporting multiple file selection would let you import more than one file at a time... but the BIDS Open dialog doesn't allow this; you can only select a single file! Hence the valuable time lost importing packages one at a time. A few days ago I learned a trick that solves the problem, thanks to this post by Matt Masson. Just copy all the packages to import from Windows Explorer (Ctrl + C): Then just right-click on the SSIS Packages folder of the Integration Services project and do a simple Paste (Ctrl + V): So "auto-magically" you'll have all those packages imported into your Integration Services project!! What can I say... this feature was well hidden!

    Read the article

  • How do I export customized Libreoffice config files?

    - by carestad
    Is this possible? I want to make my own config file for my customizations that I can apply whenever I reinstall my system. For example, Ubuntu's default font color is just stupid. I want it to be BLACK and not dark grey. And I want to turn on autosave every 3rd minute and backup files. Is there a config file that I can change? The .libreoffice/* folders and XML files don't make sense to me, and they don't seem to change when I change settings in LibreOffice. Could someone please help me out with this? Thanks.
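
    One commonly used approach is to back up the whole LibreOffice user profile rather than hunt for individual settings; below is a minimal sketch, assuming the profile lives in ~/.libreoffice (older releases) or ~/.config/libreoffice (newer ones), so adjust the path to whichever exists on your system:

      # back up the LibreOffice user profile (path is an assumption, see above)
      tar czf ~/libreoffice-profile.tar.gz -C ~ .libreoffice
      # after reinstalling, restore it with:
      tar xzf ~/libreoffice-profile.tar.gz -C ~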

    Read the article

  • How can I tweak this split-tar-gz-archive script?

    - by Sai
    I came across this shell script that can be used to split an entire directory across multiple compressed files. I am interested in making a minor tweak to it. Let's say I have n GB of files in a directory. I do not want the contents of a particular folder to be split across multiple tar files; that folder, of roughly n MB, should end up inside a single compressed file, with the remaining (n GB - n MB) split across multiple files. I am not sure whether this is possible with this script and was looking forward to some suggestions. Though the script is well documented, it is quite complex for me to understand.
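
    This is not a patch to the script itself, just a rough sketch of the idea using plain tar and split; DIR and KEEP are placeholder names for the directory and for the folder that must stay in one archive:

      # archive the folder that must stay together as a single file
      tar czf keep.tar.gz -C "$DIR" "$KEEP"
      # archive everything else and split the compressed stream into 1000 MB chunks
      tar cz -C "$DIR" --exclude="$KEEP" . | split -b 1000m - rest.tar.gz.part_
      # later, reassemble and extract with: cat rest.tar.gz.part_* | tar xz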

    Read the article

  • MTP Won't Work With Newer Ubuntu

    - by spacesword
    I have a Philips GoGear Vibe 4 GB, set to MTP mode, as it's always been. This works fine with older versions of Ubuntu and with Windows, but doesn't work with newer versions of Ubuntu. The version history looks like this:

      12.04 - works
      12.10 - works
      13.04 - doesn't work
      14.04 - doesn't work
      Windows 7 - works

    When you plug the MP3 player in under Ubuntu, the file manager opens the root of the device, which contains the folder "internal storage". When you click on "internal storage" to open it, the file manager just hangs, and if you try to unmount the MP3 player, that process hangs too, until you just unplug it. In Windows, when you click on "internal storage", it opens and shows all the files it contains. And in the earlier versions of Ubuntu it just worked. Any ideas? Thanks.

    Read the article

  • How can I view an R32G32B32 texture?

    - by bobobobo
    I have a texture with R32G32B32 floats. I create this texture in-program on D3D11, using DXGI_FORMAT_R32G32B32_FLOAT. Now I need to see the texture data for debug purposes, but it will not save to anything but dds, showing the error in debug output, "Can't find matching WIC format, please save this file to a DDS". So, I write it to DDS but I can't open it now! The DirectX texture tool says "An error occurred trying to open that file". I know the texture is working because I can read it in the GPU and the colors seem correct. How can I view an R32G32B32 texture in an image viewer?

    Read the article

  • Executable execution path: does it depend on where the executable is called from?

    - by Valkea
    As I'm still a new Linux user, I keep discovering behaviours I can't tell are "normal" or not. I searched the Internet, but as I can't really find an answer I guess it's time to ask here. A few weeks ago I installed a small game called "Machinarium" and played it... but a few days later, when I wanted to continue my game, I was unable to make it start correctly, and as I didn't have time to investigate I gave up. Yesterday, while working on a program of mine, I ran into exactly the same behaviour. So I searched a bit and discovered that, when using Nautilus with the "List view", I was able to run the program (i.e. it finds its sound, image and other resources) when I was literally "inside" the executable's folder, but not when I was in a parent folder with the tree expanded down to the executable's folder. To illustrate the behaviour here are two screenshots. It doesn't work if the executable is double-clicked from here. It does work if the executable is double-clicked from here. This is indeed the same "place", but the Nautilus view is slightly different because the current folder is not the same, and that seems to make a difference to the program. Furthermore, when I create a menu item for the executable via System Settings/Main Menu, it behaves just as if the executable can't find its resources (that's why I was unable to play Machinarium the second time: I had created a menu shortcut after my first game). So I made my program generate a text file at its root when running, and launched it from different "parent" folders to see where the text file ends up. Each time, the file was generated in the top folder of the current Nautilus view, whereas I expected to see it appear in the same folder as the executable. Can anyone explain why it works like this (I guess it's normal)? And how am I supposed to handle this when creating programs: should I detect the executable path in my C++ code, or should I organize the resource files differently than on Windows?
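
    The experiment above suggests the working directory is being set to the top folder of the Nautilus view rather than the executable's folder, so relative resource paths break. One hedged workaround (a sketch, not Machinarium's actual launcher; "mygame" is a placeholder name) is a small wrapper script kept next to the binary:

      #!/bin/sh
      # resolve this script's own location and cd there, so relative paths to
      # sounds/images work no matter where the script is launched from
      cd "$(dirname "$(readlink -f "$0")")"
      exec ./mygame "$@"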

    Read the article

  • How To: Spell Check InfoPath web form in SharePoint

    - by JeremyRamos
    This post is a compiled version of Steve Cavanagh's blog post on How To: Spell Check an InfoPath form displayed via XmlFormView. Many are not able to follow Steve's instructions due to a lack of detail. See below for a downloadable zip of all the changes needed to install the InfoPath spell checker. File contents:

      CustomSpellCheckEntirePage.js - a customized SpellCheckEntirePage.js which includes the changes outlined in Steve's post above.
      FormServer.aspx - note that this will replace the existing FormServer.aspx; this file acts like a master page for all InfoPath forms, so this change adds the spell checker to all InfoPath forms in the SharePoint farm. The only thing I changed here is to add the 'Spell Check' link before and after the form.
      ReadMe.rtf - contains instructions on where to copy the files on your MOSS WFE server.

    Read the article

  • How can I switch between fglrx and ati drivers?

    - by Santa Maradona
    Hi there, How can I switch between fglrx and ati (open source) drivers without uninstalling fglrx? The thing is, I'm using FGLRX on a daily basis because it's better for movies: the open source driver displays an annoying horizontal line while playing movies. FGLRX also does this, but it's less visible. That's the only disadvantage of the open source driver. When I want to plug in an external monitor with a different resolution than my laptop's LCD, I have to use the open source driver, because FGLRX can't display two different resolutions at the same time. I can uninstall FGLRX, but it's rather inconvenient to do this all the time. I've noticed that when I'm using FGLRX there is an xorg.conf file, and when I uninstall it, the file is missing. So my question is: how do I switch between them? It would be perfect if I could do this without restarting my computer. I'm using Ubuntu 10.10, laptop/netbook MSI U250.

    Read the article

  • Using Windows Previous Versions to access ZFS Snapshots (July 14, 2009)

    - by user12612012
    The Previous Versions tab on the Windows desktop provides a straightforward, intuitive way for users to view or recover files from ZFS snapshots. ZFS snapshots are read-only, point-in-time instances of a ZFS dataset, based on the same copy-on-write transactional model used throughout ZFS. ZFS snapshots can be used to recover deleted files or previous versions of files, and they are space efficient because unchanged data is shared between the file system and its snapshots. Snapshots are available locally via the .zfs/snapshot directory and remotely via Previous Versions on the Windows desktop. Shadow Copies for Shared Folders was introduced with Windows Server 2003 but subsequently renamed to Previous Versions with the release of Windows Vista and Windows Server 2008. Windows shadow copies, or snapshots, are based on the Volume Snapshot Service (VSS) and, as the [Shared Folders part of the] name implies, are accessible to clients via SMB shares, which is good news when using the Solaris CIFS Service. And the nice thing is that no additional configuration is required - it "just works". On Windows clients, snapshots are accessible via the Previous Versions tab in Windows Explorer using the Shadow Copy client, which is available by default on Windows XP SP2 and later. For Windows 2000 and pre-SP2 Windows XP, the client software is available for download from Microsoft: Shadow Copies for Shared Folders Client. Assuming that we already have a shared ZFS dataset, we can create ZFS snapshots and view them from a Windows client:

      zfs snapshot tank/home/administrator@snap101
      zfs snapshot tank/home/administrator@snap102

    To view the snapshots on Windows, map the dataset on the client, then right-click on a folder or file and select Previous Versions. Note that Windows will only display previous versions of objects that differ from the originals, so you may have to modify files after creating a snapshot in order to see previous versions of those files. The screenshot above shows various snapshots in the Previous Versions window, created at different times. On the left panel, the .zfs folder is visible, illustrating that this is a ZFS share. The .zfs setting can be toggled as desired; it makes no difference when using previous versions. To make the .zfs folder visible: zfs set snapdir=visible tank/home/administrator. To hide the .zfs folder: zfs set snapdir=hidden tank/home/administrator. The following screenshot shows the Previous Versions panel when a file has been selected. In this case the user is prompted to view, copy or restore the file from one of the available snapshots. As can be seen from the screenshots above, the Previous Versions window doesn't display snapshot names: snapshots are listed by snapshot creation time, sorted in time order from most recent to oldest. There's nothing we can do about this; it's the way that the interface works. Perhaps one point of note, to avoid confusion, is that the ZFS snapshot creation time is not the same as the root directory creation timestamp. In ZFS, all object attributes in the original dataset are preserved when a snapshot is taken, including the creation time of the root directory. Thus the root directory creation timestamp is the time that the directory was created in the original dataset.
      # ls -d% all /home/administrator
               timestamp: atime         Mar 19 15:40:23 2009
               timestamp: ctime         Mar 19 15:40:58 2009
               timestamp: mtime         Mar 19 15:40:58 2009
               timestamp: crtime        Mar 19 15:18:34 2009
      # ls -d% all /home/administrator/.zfs/snapshot/snap101
               timestamp: atime         Mar 19 15:40:23 2009
               timestamp: ctime         Mar 19 15:40:58 2009
               timestamp: mtime         Mar 19 15:40:58 2009
               timestamp: crtime        Mar 19 15:18:34 2009

    The snapshot creation time can be obtained using the zfs command as shown below.

      # zfs get all tank/home/administrator@snap101
      NAME                             PROPERTY  VALUE
      tank/home/administrator@snap101  type      snapshot
      tank/home/administrator@snap101  creation  Mon Mar 23 18:21 2009

    In this example, the dataset was created on March 19th and the snapshot was created on March 23rd. In conclusion, Shadow Copies for Shared Folders provides a straightforward way for users to view or recover files from ZFS snapshots. The Windows desktop provides an easy to use, intuitive GUI and no configuration is required to use or access previous versions of files or folders. References for more information: ZFS; ZFS Learning Center; Introduction to Shadow Copies of Shared Folders; Shadow Copies for Shared Folders Client.
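
    As a side note, a listing of all snapshots under the dataset together with their creation times can be had with standard zfs options (a sketch using the same dataset name as above):

      # list snapshots under the dataset together with their creation times
      zfs list -r -t snapshot -o name,creation tank/home/administrator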

    Read the article

  • How to Save Filters with PNG in Inkscape

    - by Uri Herrera
    I have created a graphic with multiple layers in Inkscape. One of the layers is some text. I have applied a drop shadow filter to the text. When I save the file as PNG, the drop shadow is not saved. I have also tried applying a Gaussian blur to the text layer, and to the text layer after converting it to an object. The blur is not applied. How can I save the file as PNG with the drop shadow intact?

    Read the article

  • Once mounted using TrueCrypt, cannot unmount

    - by zeiger
    I have an external HDD and use TrueCrypt for keeping encrypted file containers. After mounting, whenever I try to dismount a file container (using TrueCrypt 7.0a on Ubuntu 11.04), it just does not happen and I get the following message:

      device-mapper: remove ioctl failed: Device or resource busy
      Command failed

    Further, if I close TrueCrypt and then try to start it again, it says that TrueCrypt is already running, but I cannot access it from the Unity sidebar (because it is not there). Also, if I power down my external HDD, the TrueCrypt volume still shows as one of the mounted volumes, but I cannot do anything with it. Is there any possible workaround? I remember this NOT happening in earlier versions of Ubuntu, so I am guessing it has something to do with 11.04. Thanks

    Read the article

  • Failing report subscriptions

    - by DavidWimbush
    We had an interesting problem while I was on holiday. (Why doesn't this stuff ever happen when I'm there?) The sysadmin upgraded our Exchange server to Exchange 2010 and everyone's subscriptions stopped. My Subscriptions showed an error message saying that the email address of one of the recipients is invalid. When you create a subscription, Reporting puts your Windows user name into the To field, and most users have no permission to edit it. By default, Reporting leaves it up to Exchange to resolve that into an email address. This only works if Exchange is set up to translate aliases or 'short names' into email addresses. It turns out this leaves Exchange open to being used as a relay, so it is disabled out of the box. You now have three options:

      1. Open up Exchange. That would be bad.
      2. Give all Reporting users the ability to edit the To field in a subscription. a) They shouldn't have to; it should just work. b) They don't really have any business subscribing anyone but themselves.
      3. Fix the report server to add the domain. This looks like the right choice and it works for us. See below for details.

    Prerequisites: a single email domain name, and a clear relationship between the Windows user name and the email address, e.g. if the user name is joebloggs, then joebloggs@domainname needs to be the email address or an alias of it. Warning: saving changes to the rsreportserver.config file will restart the Report Server service, which effectively takes Reporting down for around 30 seconds, so time your action accordingly. Edit the file rsreportserver.config (most probably in the folder ..\Program Files[ (x86)]\Microsoft SQL Server\MSRS10_50[.instancename]\Reporting Services\ReportServer). There's a setting called DefaultHostName which is empty by default. Enter your email domain name without the leading '@' and save the file. This domain name will be appended to any destination addresses that don't have a domain name of their own.
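
    For reference, the setting sits in the e-mail delivery extension section of rsreportserver.config; a minimal illustration of the edit described above (surrounding elements omitted, "example.com" is a placeholder for your own email domain):

      <RSEmailDPConfiguration>
          ...
          <DefaultHostName>example.com</DefaultHostName>
          ...
      </RSEmailDPConfiguration>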

    Read the article

  • Adventures in Lab Management Configuration: Part 2 of 3

    - by Enrique Lima
    The first post was the high-level overview. Now it is time for the details on what was done to the existing CMMI project based on CMMI v4.2. The first step was to go into Visual Studio, then to the Team Project Collection Settings and then to the Process Template Manager. Once there, it was a matter of selecting the appropriate template (MSF for CMMI Process Improvement v5.0) and downloading it to a location I could reference later (for example C:\Templates). Then on to using the steps from the guidance post. Since I was using an x64 deployment, I will make reference to the path as <toolpath>; however, the actual path to reference in a 64-bit environment is "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE". As I mentioned in the previous post, make sure to first perform a backup of the Configuration, Collection and Warehouse DBs. If you did not apply any changes to the names and such, then you will find those as tfs_Configuration, tfs_DefaultCollection and tfs_Warehouse. Now, the work needed with the witadmin tool: that includes uploading the structures that differ from v4.2 to v5.0. There is likely going to be an issue with the naming of some fields. For example, TFS 2010 likes something along the lines of "Area ID", whereas TFS 2008 would have had it as "AreaID". So, this will need to be corrected. Some posts will have you go through this after the errors pop up; I would recommend doing it prior to executing the importwitd process.

      witadmin listfields /collection:<path to collection> > c:\ListFields.txt

    Review the following fields: for AreaID, check whether the Name property states "AreaID"; if so, you will need to rename the Name field to "Area ID". ExternalLinkCount, RelatedLinkCount, HyperLinkCount, AttachedFileCount and IterationID would be the other fields to check. To correct the issue, execute the following:

      witadmin changefield /collection:<path to collection> /n:"System.ExternalLinkCount" /name:"External Link Count"

    Repeat for Area ID, Related Link Count, Hyperlink Count, Attached File Count and Iteration ID. Once this is done, proceed with the commands below.

      witadmin importwitd /collection:<path to collection> /p:<project> /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\WorkItem Tracking\TypeDefinitions\TestCase.xml"
      witadmin importwitd /collection:<path to collection> /p:<project> /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\WorkItem Tracking\TypeDefinitions\SharedStep.xml"
      witadmin importcategories /collection:<path to collection> /p:<project> /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\WorkItem Tracking\categories.xml"

    Modifications to the Bug definition: the first step is to export the existing definition.

      witadmin exportwitd /collection:<path to collection> /p:<project> /n:bug /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\MyBug.xml"

    Make the modifications to the recently exported MyBug.xml file.
    Details for the modification are here: http://msdn.microsoft.com/en-us/library/ff452591.aspx#ModifyTask Once the changes are done, proceed with the import command:

      witadmin importwitd /collection:<path to collection> /p:<project> /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\MyBug.xml"

    Repeat the process for the Scenario or Requirement type definition:

      witadmin exportwitd /collection:<path to collection> /p:<project> /n:requirement /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\MyRequirement.xml"

    Make the modifications to the recently exported MyRequirement.xml file. Details for the modification are here: http://msdn.microsoft.com/en-us/library/ff452591.aspx#ModifyTask Once the changes are done, proceed with the import command:

      witadmin importwitd /collection:<path to collection> /p:<project> /f:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\MyRequirement.xml"

    Provide the Bug Field Mapping definition, after creating the file as specified here: http://msdn.microsoft.com/en-us/library/ff452591.aspx#TCMBugFieldMapping

      tcm bugfieldmapping /import /mappingfile:"<path to downloaded template>\MSF for CMMI Process Improvement v5.0\bugfieldmappings.xml" /collection:<path to collection> /teamproject:<project name>

    Read the article

  • Java Plugin - Firefox

    - by Tomassino
    I'm having trouble getting Java to work with Firefox (22); I have followed the advice in this question and on the official Java site, but nothing seems to work. I have the latest Java (1.7.0_25) in /opt/java and have set a symlink in /usr/libs/mozilla/plugins for the libnpjp2.so file. I can see the file in the terminal and Java runs fine. However, Firefox shows nothing in about:plugins. I have also run export JAVA_HOME="/opt/java/jre1.7.0_05/bin/java" to be on the safe side. I know there are multiple plugin directories such as /usr/lib/firefox/plugins and /usr/lib/firefox-addons/plugins, but all my current plugins show they are located in /usr/lib/mozilla/plugins when viewing the about:plugins page. I'm a bit stuck on where to go next.
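
    For context, the manual plugin install described above usually boils down to a single symlink; the paths below are assumptions based on the versions mentioned (lib/amd64 is for a 64-bit JRE, use lib/i386 for 32-bit):

      # hypothetical example: link the Java browser plugin into Firefox's plugin directory
      sudo ln -s /opt/java/jre1.7.0_25/lib/amd64/libnpjp2.so /usr/lib/mozilla/plugins/libnpjp2.so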

    Read the article

  • No bass from the speakers

    - by Bhavesh Jogadia
    In Ubuntu 11.10 there is no bass at all when I try to play MP3s or other 2-channel audio. I have 5.1 (6-channel) speakers. When I test the speakers from the sound preferences everything works perfectly fine, but when I play any MP3 there is no bass and only the satellite speakers work; when I play 5.1 movies the bass sounds fine. I also tried some changes to the daemon.conf file as instructed, but no go... When I switch my speakers to "speakers only" mode the bass plays, but the sound quality is not as good as in normal mode. I have a Creative 5.1 VX CA0160 sound card. In Windows I also had the same problem until I set the bass redirection crossover frequency, so is there any kind of software package, or any change I can make in a system file, that would make my speaker bass work, or anything that lets me change the bass redirection crossover frequency?
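
    For what it's worth, PulseAudio's /etc/pulse/daemon.conf does contain channel/LFE-related options; the lines below only illustrate the kind of settings involved and are an assumption, not a guaranteed fix for this card:

      ; in /etc/pulse/daemon.conf, then restart PulseAudio (e.g. pulseaudio -k)
      enable-lfe-remixing = yes
      default-sample-channels = 6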

    Read the article

  • WebLogic not reading boot.properties 11.1.1.x

    - by James Taylor
    In WebLogic 11.1.1.1 the boot.properties file was stored in the $MW_HOME/user_projects/domains/[domain] directory. It would be read at startup and there would be no requirement to enter a username and password. In later releases the location has changed to $MW_HOME/user_projects/domains/[domain]/servers/[managed_server]/security. In most instances you will need to create the security directory. If you want to specify a custom directory, add the following to the startup scripts for the server: -Dweblogic.system.BootIdentityFile=[loc]/boot.properties. Then create a boot.properties file using the following entries: username=<adminuser> password=<password>
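
    Laid out as a file plus a startup flag, the above amounts to something like this (placeholders kept as in the post):

      # $MW_HOME/user_projects/domains/[domain]/servers/[managed_server]/security/boot.properties
      username=<adminuser>
      password=<password>

      # optional: point the server at a custom location via its startup script
      -Dweblogic.system.BootIdentityFile=[loc]/boot.properties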

    Read the article

  • CodePlex Daily Summary for Saturday, November 02, 2013

    CodePlex Daily Summary for Saturday, November 02, 2013Popular ReleasesWsus Package Publisher: Release v1.3.1311.02: Add three new Actions in Custom Updates : Work with Files (Copy, Delete, Rename), Work with Folders (Add, Delete, Rename) and Work with Registry Keys (Add, Delete, Rename). Fix a bug, where after resigning an update, the display is not refresh. Modify the way WPP sort rows in 'Updates Detail Viewer' and 'COmputer List Viewer' so that dates are correctly sorted. Add a Tab in the settings form to set Proxy settings when WPP needs to go on Internet. Fix a bug where 'Manage Catalogs Subsc...uComponents: uComponents v6.0.0: This release of uComponents will compile against and support the new API in Umbraco v6.1.0. What's new in uComponents v6.0.0? New DataTypesImage Point XML DropDownList XPath Templatable List New features / Resolved issuesThe following workitems have been implemented and/or resolved: 14781 14805 14808 14818 14854 14827 14868 14859 14790 14853 14790 DataType Grid 14788 14810 14873 14833 14864 14855 / 14860 14816 14823 Drag & Drop support for rows Su...SmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.1: New FeaturesAdded option Limit to current basket subtotal to HadSpentAmount discount rule Items in product lists can be labelled as NEW for a configurable period of time Product templates can optionally display a discount sign when discounts were applied Added the ability to set multiple favicons depending on stores and/or themes Plugin management: multiple plugins can now be (un)installed in one go Added a field for the HTML body id to store entity (Developer) New property 'Extra...CodeGen Code Generator: CodeGen 4.3.2: Changes in this release include: Removed old tag tokens from several example templates. Fixed a bug which was causing the default author and company names not to be picked up from the registry under .NET. Added several additional tag loop expressions: <IF FIRST_TAG>, <IF LAST_TAG>, <IF MULTIPLE_TAGS> and<IF SINGLE_TAG>. Upgraded to Synergy/DE 10.1.1b, Visual Studio 2013 and Installshield Limited Edition 2013.Community Forums NNTP bridge: Community Forums NNTP Bridge V54 (LiveConnect): This is the first release which can be used with the new LiveConnect authentication. Fixes the problem that the authentication will not work after 1 hour. Also a logfile will now be stored in "%AppData%\Community\CommunityForumsNNTPServer". If you have any problems please feel free to sent me the file "LogFile.txt".Aricie - Friendlier Url Provider: Aricie - Friendlier Url Provider Version 2.5.3: This is mainly a maintenance release to stabilize the new Url Group paradigm. As usual, don't forget to install the Aricie - Shared extension first Highlights Fixed: UI bugs Min Requirements: .Net 3.5+ DotNetNuke 4.8.1+ Aricie - Shared 1.7.7+Aricie Shared: Aricie.Shared Version 1.7.7: This is mainly a maintenance version. Fixes in Property Editor: list import/export Min Requirements: DotNetNuke 4.8.1+ .Net 3.5+WPF Extended DataGrid: WPF Extended DataGrid 2.0.0.9 binaries: Fixed issue with ICollectionView containg null values (AutoFilter issue)Community TFS Build Extensions: October 2013: The October 2013 release contains Scripts - a new addition to our delivery. These are a growing library of PowerShell scripts to use with VS2013. See our documentation for more on scripting. 
VS2010 Activities(target .NET 4.0) VS2012 Activities (target .NET 4.5) VS2013 Activities (target .NET 4.5.1) Community TFS Build Manager VS2012 Community TFS Build Manager VS2013 The Community TFS Build Managers for VS2010, 2012 and 2013 can also be found in the Visual Studio Gallery where upda...SuperSocket, an extensible socket server framework: SuperSocket 1.6 stable: Changes included in this release: Process level isolation SuperSocket ServerManager (include server and client) Connect to client from server side initiatively Client certificate validation New configuration attributes "textEncoding", "defaultCulture", and "storeLocation" (certificate node) Many bug fixes http://docs.supersocket.net/v1-6/en-US/New-Features-and-Breaking-ChangesBarbaTunnel: BarbaTunnel 8.1: Check Version History for more information about this release.Mugen MVVM Toolkit: Mugen MVVM Toolkit 2.1: v 2.1 Added the 'Should' class instead of the 'Validate' class. The 'Validate' class is now obsolete. Added 'Toolkit.Annotations' to support the Mugen MVVM Toolkit ReSharper plugin. Updated JetBrains annotations within the project. Added the 'GlobalSettings.DefaultActivationPolicy' property to represent the default activation policy. Removed the 'GetSettings' method from the 'ViewModelBase' class. Instead of it, the 'GlobalSettings.DefaultViewModelSettings' property is used. Updated...NAudio: NAudio 1.7: full release notes available at http://mark-dot-net.blogspot.co.uk/2013/10/naudio-17-release-notes.htmlDirectX Tool Kit: October 2013: October 28, 2013 Updated for Visual Studio 2013 and Windows 8.1 SDK RTM Added DGSLEffect, DGSLEffectFactory, VertexPositionNormalTangentColorTexture, and VertexPositionNormalTangentColorTextureSkinning Model loading and effect factories support loading skinned models MakeSpriteFont now has a smooth vs. sharp antialiasing option: /sharp Model loading from CMOs now handles UV transforms for texture coordinates A number of small fixes for EffectFactory Minor code and project cleanup ...ExtJS based ASP.NET Controls: FineUI v4.0beta1: +2013-10-28 v4.0 beta1 +?????Collapsed???????????????。 -????:window/group_panel.aspx??,???????,???????,?????????。 +??????SelectedNodeIDArray???????????????。 -????:tree/checkbox/tree_checkall.aspx??,?????,?????,????????????。 -??TimerPicker???????(????、????ing)。 -??????????????????????(???)。 -?????????????,??type=text/css(??~`)。 -MsgTarget???MessageTarget,???None。 -FormOffsetRight?????20px??5px。 -?Web.config?PageManager??FormLabelAlign???。 -ToolbarPosition??Left/Right。 -??Web.conf...CODE Framework: 4.0.31028.0: See change notes in the documentation section for details on what's new. Note: If you download the class reference help file with, you have to right-click the file, pick "Properties", and then unblock the file, as many browsers flag the file as blocked during download (for security reasons) and thus hides all content.VidCoder: 1.5.10 Beta: Broke out all the encoder-specific passthrough options into their own dropdown. This should make what they do a bit more clear and clean up the codec list a bit. Updated HandBrake core to SVN 5855.Indent Guides for Visual Studio: Indent Guides v14: ImportantThis release has a separate download for Visual Studio 2010. The first link is for VS 2012 and later. 
Version History Changed in v14 Improved performance when scrolling and editing Fixed potential crash when Resharper is installed Fixed highlight of guides split around pragmas in C++/C# Restored VS 2010 support as a separate download Changed in v13 Added page width guide lines Added guide highlighting options Fixed guides appearing over collapsed blocks Fixed guides not...ASP.net MVC Awesome - jQuery Ajax Helpers: 3.5.3 (mvc5): version 3.5.3 - support for mvc5 version 3.5.2 - fix for setting single value to multivalue controls - datepicker min max date offset fix - html encoding for keys fix - enable Column.ClientFormatFunc to be a function call that will return a function version 3.5.1 ========================== - fixed html attributes rendering - fixed loading animation rendering - css improvements version 3.5 ========================== - autosize for all popups ( can be turned off by calling in js...Media Companion: Media Companion MC3.585b: IMDB plot scraping Fixed. New* Movie - Rename Folder using Movie Set, option to move ignored articles to end of Movie Set, only for folder renaming. Fixed* Media Companion - Fixed if using profiles, config files would blown up in size due to some settings duplicating. * Ignore Article of An was cutting of last character of movie title. * If Rescraping title, sort title changed depending on 'Move article to end of Sort Title' setting. * Movie - If changing Poster source order, list would beco...New ProjectsCarTrade.dk: program that will help make it easier for cardealers to do trade cars amongst themselvesCMSPORTAL: Nothing CruxOMatic Contributions: Crux-O-Matic-Contrib is the contribution project where the Crux-O-Matic community can contribute Crux-O-Matic components and extenders.Dynamics CRM 2013 Easy Solution Importer: Dynamics CRM 2013 Easy Solution ImporterGlobal Excel Automation PowerShell Library: The goal of the library is to automate common infrastructure tasks.HappyBin AutoUpdater: HappyBin is an auto-updater for .NET apps. It is designed as an api, and can be used as a boot-strap or a passive updater. Every app deserves a HappyBin!HashTag WCF Membership and Role Provider: Membership provider for using WCF to connect to a hosted membership endpont. Also includes comprehensive tests for custom membership and role providers.Luwx-Mobile-Store: Xây d?ng trang web bán di?n tho?i trên MVC4Pokémon Bank Online: Pokémon Bank for PBO/POQuickMatch Game: CLI multiplayer game on a single console.seizyUtils: SQL Mapping For Windows Embedded Compact SharePoint 2013 Global Metadata Navigation: Deploy your SharePoint 2013 Managed Metadata navigation term set to an unlimited number of site collections using JQuery, SCOM, CSS, and a custom master page.Simple ASP.NET MVC 3: a simple blog - Guest: view posts and add comments. - Admin: view, edit, delete posts and delete comments. User must be log in.SPRepository: SPRepository aims to be a generic Repository Design Pattern implementation for accessing SharePoint objects (but is not inherently SP).WebSocket Security: WebSocket SecutrityWildfire Ttraining Server: Server for application that tracks Wildfire around the world??? ???? 2013: ??????? ? ?????? ????? ???????????? 2013

    Read the article

  • Ars Technica .ars URL suffix -- Vanity or SEO Benefit?

    - by yc01
    The technology website Ars Technica has adjusted their URL rewrite rules so that URLs end with .ars. Traditionally, sites have taken advantage of URL rewriting to completely eliminate file suffixes like .html, .php, .aspx etc., under the theory that this made for better SEO (since the content of the URL was more relevant to the content). Ars Technica's URLs, though, look like this: http://arstechnica.com/science/news/2011/03/flow-from-the-poles-drive-sunspot-levels.ars So, is Ars Technica adding the .ars file suffix purely as a vanity play? Or is it a trick to improve the site's SEO by cleverly inserting their site name into every URL slug? And, if this is indeed an effective SEO trick, should other sites follow suit?

    Read the article

  • sysctl.conf ignore net settings

    - by Steffen Unland
    I have a little problem with sysctl on an Ubuntu 10.04 LTS system. When I set the sysctl values with "sysctl -w" everything works fine, but when I try to use the sysctl.conf file, the net settings are ignored. For example, my sysctl.conf:

      # /etc/sysctl.conf - Configuration file for setting system variables
      kernel.domainname=findme.sysctl
      # Corefiles information
      fs.suid_dumpable=2
      kernel.core_pattern=/cores/core-%e-%s-%u-%g-%p-%t
      ##############################################################3
      # Functions previously found in netbase
      net.ipv4.netfilter.ip_conntrack_tcp_timeout_fin_wait=1
      net.ipv4.netfilter.ip_conntrack_tcp_timeout_close_wait=1

    When I grep for the values, I can see that the sysctl settings for net.ipv4.netfilter aren't set:

      [host:~ ] $ sysctl -a | grep domainname
      kernel.domainname = findme.sysctl
      [host:~ ] $ sysctl -a | grep "core_pattern"
      kernel.core_pattern = /cores/core-%e-%s-%u-%g-%p-%t
      [host:~ ] $ sysctl -a | grep "timeout_fin_wait"
      net.netfilter.nf_conntrack_tcp_timeout_fin_wait = 120
      net.ipv4.netfilter.ip_conntrack_tcp_timeout_fin_wait = 120
      [host:~ ] $ sysctl -a | grep "timeout_close_wait"
      net.netfilter.nf_conntrack_tcp_timeout_close_wait = 60
      net.ipv4.netfilter.ip_conntrack_tcp_timeout_close_wait = 60

    Can somebody help me solve the problem? If you need more information I can post it. Cheers, Steffen

    Read the article

  • Problems creating a debdiff

    - by Chris Wilson
    I'm following this guide to create a debdiff for a package I'm patching. Everything goes fine until step 8, when I attempt to create the debdiff after committing the changes. The package in question is Zim, pulled from Launchpad using bzr branch lp:zim, and according to the guide I should execute the following command to create the debdiff:

      debdiff zim_0.49.dsc zim_0.49ubuntu1.dsc > zim_0.49ubuntu1.debdiff

    However, when I actually try to execute this command, I get the following error:

      debdiff: fatal error at line 314: Can't read file: zim_0.49.dsc

    Upon inspection of the directory in which the files created by debuild -S (step 6) are deposited, I find zim_0.49ubuntu1_source.changes, zim_0.49ubuntu1.dsc, zim_0.49ubuntu1.tar.gz and zim_0.49ubuntu1_source.build, but no sign of zim_0.49.dsc. I could probably create one by debuilding the package as soon as I check out the code, before starting work, but that would add an extraneous entry to the changelog. Is there a step missing from the guide that creates zim_0.49.dsc, or is the file itself missing from the source?
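
    One hedged way to obtain the original zim_0.49.dsc for the comparison, assuming a deb-src line for the relevant release is enabled in /etc/apt/sources.list, is to fetch the unmodified source package next to the patched tree:

      # downloads the unmodified zim source package, including its .dsc, into the current directory
      apt-get source zim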

    Read the article

  • RewriteRule not working at server level?

    - by Alexis Wilke
    I wanted to forbid some robots from doing certain things to my websites and decided to add a RewriteRule for that purpose. The rule works when put in one of my <VirtualHost *:80> tags and looks like this:

      RewriteEngine On
      RewriteCond %{HTTP_USER_AGENT} libwww-perl
      RewriteCond %{REQUEST_METHOD} POST
      RewriteRule . - [F,L]

    However, I wanted to apply it to all my websites instead of just one of them. So, with the newest layout of the Apache2 settings, I decided to put that code in the security.conf file. This file lives under /etc/apache2/conf-available/... (and yes, I have a softlink from the /etc/apache2/conf-enabled/... directory.) However, if the definition is only in conf-available/security.conf, it somehow gets ignored. The documentation says that these Rewrite* directives all work at server level! Any idea what I might be missing?
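
    One thing worth checking (a hedged guess, not a confirmed diagnosis for this setup): mod_rewrite settings from the main server context are not inherited by virtual hosts unless each vhost explicitly opts in, along these lines:

      <VirtualHost *:80>
          # ... existing site configuration ...
          RewriteEngine On
          RewriteOptions Inherit   # pull in the server-level RewriteCond/RewriteRule directives
      </VirtualHost>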

    Read the article

  • Week in Geek: Rogue Antivirus Caught Using AVG Logo Edition

    - by Asian Angel
    This week we learned how to quickly cut a clip from a video file with Avidemux, "tile windows, remote control a desktop using an iOS device, taking advantage of Windows 7 libraries", turn a home Ubuntu PC into a LAMP web server, enable desktop notifications for Gmail in Chrome, "what image channels are and what they mean", and more.

    Read the article

  • Desktop Fun: Happy New Year Wallpaper Collection [Bonus Edition]

    - by Asian Angel
    As this year draws to a close, it is a time to reflect back on what we have done this year and to look forward to the new one. To help commemorate the event we have put together a bonus size edition of Happy New Year wallpapers for your desktops. Extra Note: We made a special effort to find wallpapers for this collection without the year "printed" on them, thus allowing for reuse as desired and/or needed beyond the 2010 – 2011 holiday. Note: Click on the picture to see the full-size image—these wallpapers vary in size so you may need to crop, stretch, or place them on a colored background in order to best match them to your screen's resolution. For more New Year's desktop goodness be sure to check out our Happy New Year icon & font packs collection (link at bottom)! Note: This wallpaper will need to be placed on a larger white background in order to increase the height. Note: This wallpaper will need to be placed on a larger background in order to increase the width and height. Note: This wallpaper comes in multiple sizes and will need to be downloaded as a zip file. Note: This wallpaper comes in multiple sizes and will need to be downloaded as a zip file. Note: The download size for the original version of this wallpaper is 15 MB. Note: The download size for the original version of this wallpaper is 15 MB. More Happy New Year Fun: Desktop Fun: Happy New Year Icon and Font Packs. For more wallpapers be certain to see our great collections in the Desktop Fun section.

    Read the article

  • First Foray&ndash;About timeout

    - by SQLMonger
    It has been quite a while since I signed up for this blog site, and high time that something was posted. I have a list of topics that I will be working through and posting. Some, I am sure, will have been posted by others, but I will be sticking to the technical problems and challenges that I've recently faced, and the solutions that worked for me. My motto when learning something new has always been "My kingdom for an example!", and I plan on delivering useful examples here so others can learn from my efforts, failures and successes. A bit of background about me: my name is Clayton Groom. I am a founding partner of a consulting firm in St. Louis, Missouri, Covenant Technology Partners, LLC, and focus on SQL Server Data Warehouse design, Analysis Services and Enterprise Reporting solutions. I have been working with SQL Server since the early nineties, when it still only ran on OS/2. I love solving puzzles and technical challenges. Enough about me; on to a real problem: SSIS Connection Timeouts versus Command Timeouts. Last week, I was working on automating the processing for a large Analysis Services cube. I had reworked an SSIS package and script task originally posted by Vidas Matelis that automates the process of adding new and dropping old partitions to/from an Analysis Services cube. I had the package working great, tested, and ready for deployment. It basically performs a query against the source system to determine if there is new data in the warehouse that will require a new partition to be added to the cube, and it checks the cube to see if there are any partitions present that are no longer needed in a rolling 60-month window. My client uses Tivoli for running all their production jobs, not SQL Agent, so I had to build a command line file for Tivoli to use to run the package. Everything was going great. I had tested the command file from my development workstation using an XML configuration file to pass server-specific parameters into the package when executed using the DTExec utility. With all the pieces ready, I updated the dtsconfig file to point to the UAT environment and started working with the Tivoli developer to test the job. On the first run, the job failed, and from what I could see in the SSIS log, it had failed because of a timeout. Other errors in the log made me think that perhaps the connection string had not been passed into the package correctly. We bumped the Connection Manager timeout values from 20 seconds to 120 seconds and tried again. The job still failed. After changing the command line to use the /SET option instead of the /CONFIGFILE option, we tested again, and again failure. After several more failed attempts, and getting the Teradata DBA involved to monitor and see if we were connecting and failing or just failing to connect, we determined that the job was indeed connecting to the server and then disconnecting itself after 30 seconds. This seemed odd, as we had the timeout values for the connection manager set to 180 seconds by then. At this point one of the DBAs found a post on the Teradata forum that had the clues to the puzzle: there is a separate "CommandTimeout" custom property on the data source object that may need to be adjusted for longer-running queries. I opened up the SSIS package, opened the data flow task that generated the partition list table and right-clicked on the data source. From the context menu, I selected "Show Advanced Editor" and found the property. Sure enough, it was set to 30 seconds.
    The CommandTimeout property can also be edited in the SSIS Properties sheet. In order to determine how long the timeout needed to be, I ran the query from the task in the development environment and received a response in a matter of seconds. I then tried the same query against the production database and waited several minutes for a response. This did not seem to be a reasonable response time for the query involved, and indeed it wasn't. The Teradata DBAs adjusted the query governor settings for the service account I was testing with, and we were able to get the response back down under a minute. Still, I set the CommandTimeout property to a much higher value in case the job was ever started during a time of high demand on the production server. With this change in place, the job finally completed successfully. The lesson learned for me was two-fold:

      1. Always compare query execution times between development and production environments, and don't assume that production will always be faster. With higher user demands, query governors, and a whole lot more data, the execution time of even what might seem to be simple queries can vary greatly.
      2. SSIS connection timeout settings do not affect command timeouts. Connection timeouts control how long the package will wait for a response from the server before assuming the server is not available or is not responding. Command timeouts control how long a task will wait for results to start being returned before deciding that the server is not responding.

    Both lessons seem pretty straightforward, and I felt pretty sheepish once I finally figured out what the issue was. To be fair though, in the 5+ years that I have been working with SSIS, I could only recall one other time where I had to set the CommandTimeout property, and that memory only resurfaced while I was penning this post.

    Read the article

  • How to clone a VirtualBox Disk

    - by [email protected]
    Copying the image of a Virtual Disk (.vdi file) is a convenient way to duplicate the disk, in cases where you want to avoid re-installing an operating system from scratch. However, simply copying the .vdi file into another location makes a verbatim copy of the virtual disk, including the UUID of the disk. If you try to add the copy in the Virtual Media Manager, you will get an error like this: In this case, all you have to do is clone the vdi disk:

      cd C:\Program Files\Sun\VirtualBox
      C:\Program Files\Sun\VirtualBox> vboxmanage clonevdi G:\VMWARES\Database\11GR2onOEL5forVbox\11GR2_OEL5_32GB.vdi G:\VMWARES\Database\11GR2onOEL5forVbox\OEL5_32GB.vdi
      $ VBoxManage clonevdi Master.vdi Clone.vdi

    In case you receive an error like this, it means that the disk is already a copy of another VirtualBox disk. In that case you should change the UUID before cloning the disk. Follow the steps given here in order to do that.
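
    The UUID change mentioned at the end can also be done with VBoxManage itself; a sketch (the file name is a placeholder):

      REM assign a fresh UUID to the copied disk image before attaching or cloning it
      "C:\Program Files\Sun\VirtualBox\VBoxManage.exe" internalcommands sethduuid "Copy.vdi"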

    Read the article
