Search Results

Search found 40122 results on 1605 pages for 'ms access 2007'.


  • Adding Column to a SQL Server Table

    - by Dinesh Asanka
    Adding a column to a table is a common task for DBAs. You can add a column to a table either as a nullable column or as a column with default values. But are these two operations similar internally, and which method is optimal? Let us start with an example. I created a database and a table using the following script:

        USE master
        GO
        --Drop the database if it exists
        IF EXISTS (SELECT 1 FROM sys.databases WHERE name = 'AddColumn')
        DROP DATABASE AddColumn
        --Create the database
        CREATE DATABASE AddColumn
        GO
        USE AddColumn
        GO
        --Drop the table if it exists
        IF EXISTS (SELECT 1 FROM sys.tables WHERE name = 'ExistingTable')
        DROP TABLE ExistingTable
        GO
        --Create the table
        CREATE TABLE ExistingTable
        (ID BIGINT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
         DateTime1 DATETIME DEFAULT GETDATE(),
         DateTime2 DATETIME DEFAULT GETDATE(),
         DateTime3 DATETIME DEFAULT GETDATE(),
         DateTime4 DATETIME DEFAULT GETDATE(),
         Gendar CHAR(1) DEFAULT 'M',
         STATUS1 CHAR(1) DEFAULT 'Y')
        GO
        --Insert 100,000 records with default values
        INSERT INTO ExistingTable DEFAULT VALUES
        GO 100000

    Before adding a column

    Before adding a column, let us look at some of the details of the database.

        DBCC IND (AddColumn, ExistingTable, 1)

    By running the above query, you will see 637 pages for the created table.

    Adding a column

    You can add a nullable column to the table with the following statement:

        ALTER TABLE ExistingTable ADD NewColumn INT NULL

    The above adds the column with a NULL value for the existing records. Alternatively, you can add the column with a default value:

        ALTER TABLE ExistingTable ADD NewColumn INT NOT NULL DEFAULT 1

    This statement adds the column with a value of 1 for all the existing records. In the table below I measured the performance difference between the two statements:

        Parameter   Nullable Column   Default Value
        CPU         31                702
        Duration    129 ms            6653 ms
        Reads       38                116,397
        Writes      6                 1329
        Row Count   0                 100000

    If you look at the Row Count parameter, you can clearly see the difference. Though a column is added in the first case, none of the rows are affected, while in the second case all the rows are updated. That is the reason why adding the column with a default value takes more duration and CPU. We can verify this by several methods.

    Number of pages

    The number of data pages can be obtained by using the DBCC IND command. Though this is an undocumented DBCC command, many experts are comfortable using it in production. However, since there is no official word from Microsoft, use it "at your own risk".

        DBCC IND (AddColumn, ExistingTable, 1)

        Before adding the columns:                 637 pages
        After adding a column with NULL:           637 pages
        After adding a column with DEFAULT value:  1270 pages

    This clearly shows that the pages are physically modified when the column is added with a default value. Please note that the high page count in that case is also partly a result of page splits. Continues…
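    To reproduce the comparison without a Profiler trace, SET STATISTICS TIME and SET STATISTICS IO report the same CPU, duration, and read counts per statement. This is a minimal sketch based on the script above, not part of the original article; the new column names are hypothetical, and note that on some later versions/editions of SQL Server the NOT NULL ... DEFAULT case can become a metadata-only change, so the numbers may differ.

        USE AddColumn
        GO
        SET STATISTICS TIME ON;
        SET STATISTICS IO ON;
        GO
        -- Case 1: nullable column, metadata-only change
        ALTER TABLE ExistingTable ADD NullableColumn INT NULL;
        GO
        -- Case 2: NOT NULL with default, updates every existing row
        ALTER TABLE ExistingTable ADD DefaultedColumn INT NOT NULL DEFAULT 1;
        GO
        -- Compare CPU time, elapsed time, and logical reads in the Messages tab.
        SET STATISTICS TIME OFF;
        SET STATISTICS IO OFF;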

    Read the article

  • SB Timmy

    - by csharp-source.net
    SB Timmy is an IMAP mail client for WAP/WML devices. It's written in C#/ASP.NET (works with both the MS .NET Framework and Mono). Timmy handles all types of MIME content (base64 and quoted-printable encodings; multipart messages). It can send mail through SMTP. It's possible to download message attachments to your mobile device (like JPEG photos). Timmy is multi-language (currently English and Lithuanian translations).

    Read the article

  • What's wrong with my htaccess? (500 Error)

    - by Dany Khalife
    I've written a small .htaccess file to redirect Internet Explorer users to a specific page. Here are the contents:

        # MS Internet Explorer - Mozilla v4
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
        RewriteRule ^index\.php$ /sorry.php [L]
        # All other browsers
        #RewriteRule ^index\.html$ /index.32.html [L]

    Any clue why this would give a 500 Internal Server Error? I have used mod_rewrite before, so I have the module loaded there...
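    (Not part of the original question, but a general first step for any .htaccess 500: Apache's error log names the exact directive that failed, which beats guessing. The log path below is the Debian/Ubuntu default and is an assumption about this server.)

        # Watch the error log while reloading the page in a browser.
        tail -f /var/log/apache2/error.log

        # On Red Hat-style systems the log usually lives here instead:
        # tail -f /var/log/httpd/error_log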

    Read the article

  • Solving Inbound Refinery PDF Conversion Issues, Part 1

    - by Kevin Smith
    Working with Inbound Refinery (IBR) and PDF conversion can be very frustrating. When everything is working smoothly you kind of forget it is even there. Documents are checked into WebCenter Content (WCC), sent to IBR for conversion, converted to PDF, returned to WCC, and voilà: your Office documents have a nice PDF rendition available for viewing. Then a user checks in a bunch of password-protected Word files, the conversions fail, your IBR queue starts backing up, users start calling asking why their documents have not been released yet, and you spend a frustrating afternoon trying to recover and get things running properly again.

    Password-protected documents are one cause of PDF conversion failures, and I will cover those in a future blog post, but there are many other problems that can cause conversions to fail, especially when working with the WinNativeConverter and using the native applications, e.g. Word, to convert a document to PDF. There are other conversion options, like PDFExportConverter, which uses Oracle OutsideIn to convert documents directly to PDF without the need for the native applications. However, to get the best fidelity to the original document the native applications must be used. Many customers have tried PDFExportConverter but have stayed with the native applications for conversion, since the conversion results from PDFExportConverter were not as good as when the native applications are used.

    One problem I ran into recently, which at least has an easy solution, is Word documents that display a Show Repairs dialog when the document is opened. If you open the problem document yourself you will see this dialog. This will cause the conversion to time out: any time the native application displays a dialog that requires user input, the conversion will time out. The solution is to add a BulletProofOnCorruption setting to the registry for the user running Word on the IBR server. See this support note from Microsoft for details. The support note says to set the registry key under HKEY_CURRENT_USER, but since we are running IBR as a service the correct location is under HKEY_USERS\.DEFAULT. Also, since our environment was running Office 2007, the correct registry key to use was:

        HKEY_USERS\.DEFAULT\Software\Microsoft\Office\12.0\Word\Options

    Once you have done this, restart the IBR managed server and resubmit your problem document. It should now be converted successfully. For more details on IBR see the Oracle® WebCenter Content Administrator's Guide for Conversion.
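    A quick way to apply that setting is from an elevated command prompt on the IBR server. This is a sketch based on the support note described above, not from the original post; the value data of 1 is an assumption, so confirm it against the Microsoft note before relying on it.

        :: Enable Word's BulletProofOnCorruption behaviour for the service profile.
        :: The 12.0 key targets Office 2007; adjust it for your Office version.
        reg add "HKU\.DEFAULT\Software\Microsoft\Office\12.0\Word\Options" /v BulletProofOnCorruption /t REG_DWORD /d 1 /f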

    Read the article

  • Fonts look squashed or stretched in the browser on Ubuntu

    - by Arjun Menon
    Fonts in the browser on Ubuntu look squashed/stretched compared to Windows/OSX. This image shows exactly what I mean: http://i.stack.imgur.com/suUXX.png

    I installed msttcorefonts and configured both Chrome & FF to use Microsoft fonts (Arial, Times New Roman) instead of the default ones. While MS fonts made web pages appear a bit different, the squashed/stretched look remained regardless of the font. FreeSans looks a little different from Arial, but it too is rendered squashed/stretched like Arial on both FF & Chrome. Opera renders the Wikipedia page differently from FF & Chrome, but the fonts look squashed/stretched on it as well.

    I used to run Kubuntu prior to switching to Ubuntu, and at some point I managed to get the fonts in Chrome (only Chrome) to look exactly like in the image on the left. I have no idea how I did it though. Firefox and Rekonq retained the squashed/stretched look. I had been using Rekonq for a while, then switched to FF. While using both browsers I had done various things to get the fonts to look better, with no success - like installing MS fonts & configuring both browsers to use them. I then, after some time, installed Chrome, and the fonts magically looked perfect on it - just like on the right-hand side of the image. In fact, the font smoothing looked better (to my eye) compared to Windows and OSX. All 3 OSes use subtly different font smoothing strategies and the differences stand out.

    Later, I formatted & installed Ubuntu 12.04. The first thing I did was install msttcorefonts & then install Chrome. To my dismay, the fonts in Chrome looked just as squashed/stretched as they did in Firefox. There's no browser (except Wine Internet Explorer) that renders fonts properly on my Ubuntu setup right now. Fixing this is definitely possible, since I was able to do it on Kubuntu, but apparently it requires some mysterious tweaking. Would anyone be willing to help me out?

    Read the article

  • How To: Spell Check InfoPath web form in SharePoint 2010

    - by Jeremy Ramos
    Originally posted on: http://geekswithblogs.net/JeremyRamos/archive/2013/11/07/how-to-spell-check-infopath-web-form-in-sharepoint-2010.aspx

    This is a sequel to my 2011 post about How To: Spell Check InfoPath Web Form in SharePoint. This time I will share how I managed to achieve spell checking in SharePoint 2010. This time round, we have changed our online forms strategy to use custom lists instead of form libraries. I thought everything would be smooth sailing as we are using all OOTB features. So, we customised a custom list form using InfoPath and added a few Rich Text Boxes (spell check is a requirement for this specific project). All is good in the InfoPath client, including the spell checker, so, happy days, I published straight away.

    Here come the surprises. I browsed to my custom list and clicked Add New Item. This launched my form in a modal dialog. I went to my Rich Text Boxes to check the spell checker and, voilà, it's disabled!

    I tried hacking FormServer.aspx and CustomSpellCheckEntirePage.js again, but the new FormServer.aspx behaves differently from MOSS 2007's. I searched for answers in many blogs to no avail, often ending up being linked to my old blog post. I also tried placing the spell check JavaScript into a Content Editor Web Part on the item's New form and Edit form. It launches the Spell Check dialog, but it does not spell-check the page correctly. At this point I decided I needed to get my project across ASAP, so enough with experimentation, and I logged a ticket with Microsoft Premier Support.

    On a call with the support engineer, I browsed through the custom list and to the item to demonstrate my problem. Suddenly, the Spell Check tab in the toolbar was enabled! Surprised? Not much, it's Microsoft! Anyway, to cut my story short, here is a summary of my solution:

    1. Navigate to your custom list.
    2. In the ribbon toolbar, navigate to List > Customize List > Form Web Parts > Content Type Forms > (Item) New Form. This displays newifs.aspx, the page shown when Add New Item is clicked. This page, just like any other SharePoint page, contains web parts - in this case, the InfoPath Form Web Part.
    3. Add a Content Editor Web Part (CEWP) on top of the InfoPath Form Web Part. (A blank CEWP will do for this example.)
    4. Navigate to Page and click Stop Editing.
    5. Click Add New Item again and navigate to a Rich Text Box. Tadah! The Spell Check tab is now enabled!
    6. Do the same steps for the (Item) Edit Form to enable spell check when editing an item.

    This "no code" solution was discovered purely by accident!

    Read the article

  • passwordless ssh not working

    - by kuurious
    I've tried to set up password-less ssh between A and B, and B and A as well. I generated the public and private keys using ssh-keygen -t rsa on both machines, and used the ssh-copy-id utility to copy the public keys from A to B as well as B to A. Password-less ssh works from A to B but not from B to A. I've checked the permissions of the ~/.ssh/ folder and they seem to be normal.

    A's .ssh folder permissions:

        -rw------- 1 root root 13530 2011-07-26 23:00 known_hosts
        -rw------- 1 root root   403 2011-07-27 00:35 id_rsa.pub
        -rw------- 1 root root  1675 2011-07-27 00:35 id_rsa
        -rw------- 1 root root   799 2011-07-27 00:37 authorized_keys
        drwxrwx--- 70 root root 4096 2011-07-27 00:37 ..
        drwx------  2 root root 4096 2011-07-27 00:38 .

    B's .ssh folder permissions:

        -rw------- 1 root root   884 2011-07-07 13:15 known_hosts
        -rw-r--r-- 1 root root   396 2011-07-27 00:15 id_rsa.pub
        -rw------- 1 root root  1675 2011-07-27 00:15 id_rsa
        -rw------- 1 root root  2545 2011-07-27 00:36 authorized_keys
        drwxr-xr-x 8 root root  4096 2011-07-06 19:44 ..
        drwx------ 2 root root  4096 2011-07-27 00:15 .

    A is an Ubuntu 10.04 machine (OpenSSH_5.3p1 Debian-3ubuntu4, OpenSSL 0.9.8k 25 Mar 2009). B is a Debian machine (OpenSSH_5.1p1 Debian-5, OpenSSL 0.9.8g 19 Oct 2007).

    From A: #ssh B works fine.

    From B: #ssh -vvv A

        ...
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug2: key: /root/.ssh/identity ((nil))
        debug2: key: /root/.ssh/id_rsa (0x7f1581f23a50)
        debug2: key: /root/.ssh/id_dsa ((nil))
        debug3: Wrote 64 bytes for a total of 1127
        debug1: Authentications that can continue: publickey,password
        debug3: start over, passed a different list publickey,password
        debug3: preferred gssapi-keyex,gssapi-with-mic,gssapi,publickey,keyboard-interactive,password
        debug3: authmethod_lookup publickey
        debug3: remaining preferred: keyboard-interactive,password
        debug3: authmethod_is_enabled publickey
        debug1: Next authentication method: publickey
        debug1: Trying private key: /root/.ssh/identity
        debug3: no such identity: /root/.ssh/identity
        debug1: Offering public key: /root/.ssh/id_rsa
        debug3: send_pubkey_test
        debug2: we sent a publickey packet, wait for reply
        debug3: Wrote 368 bytes for a total of 1495
        debug1: Authentications that can continue: publickey,password
        debug1: Trying private key: /root/.ssh/id_dsa
        debug3: no such identity: /root/.ssh/id_dsa
        debug2: we did not send a packet, disable method
        debug3: authmethod_lookup password
        debug3: remaining preferred: ,password
        debug3: authmethod_is_enabled password
        debug1: Next authentication method: password
        [email protected]'s password:

    Which essentially means it's not authenticating using the file /root/.ssh/id_rsa. I ran the ssh-add command on both machines as well. The authentication part of the /etc/ssh/sshd_config file is:

        # Authentication:
        LoginGraceTime 120
        PermitRootLogin yes
        StrictModes yes
        RSAAuthentication yes
        PubkeyAuthentication yes
        #AuthorizedKeysFile %h/.ssh/authorized_keys
        # Don't read the user's ~/.rhosts and ~/.shosts files

    I'm running out of ideas. Any help would be appreciated.
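    One detail stands out in the listings above (my observation, not part of the original question): on A, the parent directory of .ssh (root's home) is group-writable (drwxrwx---), and with StrictModes yes sshd silently ignores authorized_keys whenever the home directory, ~/.ssh, or the key file is writable by anyone other than the owner. A hedged sketch of the tightening to try on A:

        # On machine A: StrictModes makes sshd refuse authorized_keys when the
        # home directory, ~/.ssh, or the file itself is group/world-writable.
        chmod go-w /root                      # remove group write from root's home
        chmod 700 /root/.ssh                  # keep .ssh private
        chmod 600 /root/.ssh/authorized_keys  # key file readable by owner only

        # Then retry from B and watch A's auth log for confirmation:
        # ssh -vvv root@A
        # tail /var/log/auth.log   (on A)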

    Read the article

  • BRE (Business Rules Engine) Data Services is out...!!!

    - by Vishal
    A few months ago we at Tellago open sourced the BizTalk Data Services. We have meanwhile been working on other artifacts which come along with BizTalk Server, like the Business Rules Engine. We are happy to announce the first version of BRE Data Services.

    BRE Data Services follows the same concept which we covered with the BTS Data Services: providing a RESTful OData-based API to interact with the Business Rules Engine via HTTP, using the Atom Publishing Protocol or JSON as the encoding mechanism.

    In this first version we mainly focused on browsing, querying and searching BRE artifacts via a RESTful interface. Along with that, we provide the functionality to execute business rules by inserting the facts for policies via the IUpdatable implementation of WCF Data Services.

    The BRE Data Services API provides a lightweight interface for managing Business Rules Engine artifacts such as Policies, Rules, Vocabularies, Conditions, Actions, Facts etc. The following examples detail some of the features available in the current version of the API.

    Basic querying:

    Querying BRE Policies: http://localhost/BREDataServices/BREMananagementService.svc/Policies
    Querying BRE Rules: http://localhost/BREDataServices/BREMananagementService.svc/Rules
    Querying BRE Vocabularies: http://localhost/BREDataServices/BREMananagementService.svc/Vocabularies

    Navigation:

    The BRE Data Services API also leverages WCF Data Services to enable navigation across related BRE objects.

    Querying a specific Policy: http://localhost/BREDataServices/BREMananagementService.svc/Policies('PolicyName')
    Querying a specific Rule: http://localhost/BREDataServices/BREMananagementService.svc/Rules('RuleName')
    Querying all Rules under a Policy: http://localhost/BREDataServices/BREMananagementService.svc/Policies('PolicyName')/Rules
    Querying all Facts under a Policy: http://localhost/BREDataServices/BREMananagementService.svc/Policies('PolicyName')/Facts
    Querying all Actions for a specific Rule: http://localhost/BREDataServices/BREMananagementService.svc/Rules('RuleName')/Actions
    Querying all Conditions for a specific Rule: http://localhost/BREDataServices/BREMananagementService.svc/Rules('RuleName')/Conditions
    Querying a specific Vocabulary: http://localhost/BREDataServices/BREMananagementService.svc/Vocabularies('VocabName')

    Implementation:

    With BRE Data Services we also provide the functionality of executing a particular policy via HTTP. There are a couple of ways you can do that through the API.

    First is through the Service Operations feature of WCF Data Services, in which you can execute the facts by passing them in the URL itself. This is a very simple way of executing policies, given the current limitations and restrictions of WCF Data Services Service Operations (only primitive types can be passed as input parameters). The original post illustrates this with a code sample and a traced request/response message.

    Second is through the IUpdatable interface of WCF Data Services. In this method, you first query the rule which you want to execute, then insert facts for that particular rule, and finally, when you call SaveChanges() on the IUpdatable interface API, it executes the policy with the facts which you inserted at runtime. The original post shows a sample of the client-side code here.

    Due to the limitations of the current version of WCF Data Services, there is no way to return the updates happening on the service side back to the client via the SaveChanges() method. Here we are executing the rule passing a serialized XML as facts, and no changes are made to any data that we could query back to fetch the changes. This is overcome through the first way of executing the policies, which is executing it as a Service Operation call.

    This actually generates an AtomPub message shown as below:

        POST /Tellago.BRE.REST.ServiceHost/BREMananagementService.svc/$batch HTTP/1.1
        User-Agent: Microsoft ADO.NET Data Services
        DataServiceVersion: 1.0;NetFx
        MaxDataServiceVersion: 2.0;NetFx
        Accept: application/atom+xml,application/xml
        Accept-Charset: UTF-8
        Content-Type: multipart/mixed; boundary=batch_6b9a5ced-5ecb-4585-940a-9d5e704c28c7
        Host: localhost:8080
        Content-Length: 1481
        Expect: 100-continue

        --batch_6b9a5ced-5ecb-4585-940a-9d5e704c28c7
        Content-Type: multipart/mixed; boundary=changeset_184a8c59-a714-4ba9-bb3d-889a88fe24bf

        --changeset_184a8c59-a714-4ba9-bb3d-889a88fe24bf
        Content-Type: application/http
        Content-Transfer-Encoding: binary

        MERGE http://localhost:8080/Tellago.BRE.REST.ServiceHost/BREMananagementService.svc/Facts('TestPolicy') HTTP/1.1
        Content-ID: 4
        Content-Type: application/atom+xml;type=entry
        Content-Length: 927

        <?xml version="1.0" encoding="utf-8" standalone="yes"?>
        <entry xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"
               xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
               xmlns="http://www.w3.org/2005/Atom">
          <category scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" term="Tellago.BRE.REST.Resources.Fact" />
          <title />
          <author>
            <name />
          </author>
          <updated>2011-01-31T20:09:15.0023982Z</updated>
          <id>http://localhost:8080/Tellago.BRE.REST.ServiceHost/BREMananagementService.svc/Facts('TestPolicy')</id>
          <content type="application/xml">
            <m:properties>
              <d:FactInstance>&lt;ns0:LoanStatus xmlns:ns0="http://tellago.com"&gt;&lt;Age&gt;10&lt;/Age&gt;&lt;Status&gt;true&lt;/Status&gt;&lt;/ns0:LoanStatus&gt;</d:FactInstance>
              <d:FactType>TestSchema</d:FactType>
              <d:ID>TestPolicy</d:ID>
            </m:properties>
          </content>
        </entry>
        --changeset_184a8c59-a714-4ba9-bb3d-889a88fe24bf--
        --batch_6b9a5ced-5ecb-4585-940a-9d5e704c28c7--

    Installation:

    The installation of BRE Data Services is pretty straightforward.
    - Create a new IIS website, say BREDataServices.
    - Download the source code from Tellago's Codeplex and copy the content from Tellago.BRE.REST.ServiceHost to the physical location of the above created website.
    - The app pool account running the website should have admin access to the BizTalkRuleEngineDb database.
    - Right-click the BREManagementService.svc in the IIS Content View for the website and voilà.

    Conclusion:

    The BRE Data Services API is an experiment intended to bring the capabilities of RESTful/OData-based services to traditional BTS/BRE solutions. Future releases will target technologies like BAM and the ESB Toolkit. This version has been tested with various versions of BizTalk Server, and we have uploaded the source code to our Tellago DevLabs workspace at Codeplex. I hope you guys enjoy this release. Keep an eye on our new releases at Tellago Codeplex.

    Till then, happy BizzRuling…!!!

    Thanks,

    Vishal Mody
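    The client-side sample in the original post is a screenshot and is not reproduced here. Below is a minimal sketch of what the IUpdatable flow could look like with the WCF Data Services client library; the Fact entity shape (ID, FactType, FactInstance) is inferred from the m:properties element in the traced request above, and the class and set names are assumptions rather than the actual Tellago API.

        // Hypothetical client for the BRE Data Services IUpdatable flow.
        using System;
        using System.Data.Services.Client;

        public class Fact
        {
            public string ID { get; set; }           // policy name (entity key)
            public string FactType { get; set; }     // schema used for the fact
            public string FactInstance { get; set; } // serialized XML fact
        }

        class Program
        {
            static void Main()
            {
                var ctx = new DataServiceContext(
                    new Uri("http://localhost:8080/Tellago.BRE.REST.ServiceHost/BREMananagementService.svc"));

                var fact = new Fact
                {
                    ID = "TestPolicy",
                    FactType = "TestSchema",
                    FactInstance = "<ns0:LoanStatus xmlns:ns0=\"http://tellago.com\">" +
                                   "<Age>10</Age><Status>true</Status></ns0:LoanStatus>"
                };

                // Attach the fact to the Facts set and mark it as modified;
                // SaveChanges() sends the MERGE seen in the trace, which the
                // service's IUpdatable implementation turns into a policy run.
                ctx.AttachTo("Facts", fact);
                ctx.UpdateObject(fact);
                ctx.SaveChanges();
            }
        }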

    Read the article

  • Launch 2010 Technical Readiness Unofficial Q&A.

    - by mbcrump
    I had an email from one of my readers about the 2010 Technical Readiness Series. Please read below:

        Hi Michael, I noticed you blogged a while back that you were going to attend the MS 2010 Launch event. I'm going to the session in Seattle on May 27, is it worth attending? Also, I'm wondering if they give any free software away, like VS2010 Pro? Any decent vendors? Looking forward to hearing back from you.

    I decided this information would probably benefit several people instead of just responding to the reader directly. In case you are not aware, MS has a 2010 launch event showing VS2010, Office 2010, SharePoint 2010 and SQL Server 2008 R2. You can sign up for an event here: https://microsoft.crgevents.com/Register2010/Content/Event_Selection.aspx

    I've answered the questions below.

    Q: Is it worth going?
    A: It is if you have had little or no exposure to the latest 2010 products. Most people are familiar with VS2010, but have not seen SharePoint 2010, Office 2010 or Windows Phone 7. It is designed to get you up to speed very quickly. If you have watched most of the MIX videos and keep up with .NET in general, you will benefit more by having the ability to ask questions.

    Q: Did you get any free software?
    A: No, only demos, including:
    - VS2010/Office 2010 – they give you a link to the URL where you can download the trial version.
    - Windows 7 Enterprise – you get a DVD with the trial version loaded.

    Q: Do they give away any cool swag?
    A: Just a Microsoft T-shirt (XL).

    Q: What about the vendor selection?
    A: At the event that I went to, most vendors were pushing SharePoint products. There wasn't a lot of variety in the selection. Most vendors were giving away the typical pens, buttons, stickers and trial software.

    If you have any other questions, feel free to contact me. I will answer and add to this unofficial FAQ.

    Read the article

  • How to Split an Outlook PST File?

    MS Outlook PST Files: Personal Storage Table (PST) is a vital component of Microsoft Outlook email client. Almost all the Outlook mailbox items including mail messages, contacts, notes, calendar, jou... [Author: Pamela Broom - Computers and Internet - April 12, 2010]

    Read the article

  • Licensing issues with using code from samples coming with SDK

    - by Andrey
    Samples that come with an SDK are intended to demonstrate best practices, so it seems perfectly valid to take code from them. But samples usually come under licenses; for example, a lot of samples from Microsoft are released under the Microsoft Public License (MS-PL). Samples are usually published to provide best practices and common reusable code - so how can I use code from samples if they are under rather strict licenses?

    Read the article

  • Useful links for InfoPath form development

    - by ybbest
    PowerShell:
    http://msdn.microsoft.com/en-us/library/ms442691.aspx
    http://technet.microsoft.com/en-us/library/ee806878.aspx
    http://sharepoint.microsoft.com/blogs/zach/Lists/Posts/Post.aspx?ID=7

    InfoPath:
    http://msdn.microsoft.com/en-us/library/aa943232.aspx
    http://blah.winsmarts.com/2008-8-Deploying_InfoPath_2007_Forms_to_Forms_Server_-and-ndash_Properly.aspx
    http://msdn.microsoft.com/en-us/library/microsoft.office.infopath.server.administration.xsnfeaturereceiver.aspx
    http://www.articlesbase.com/web-hosting-articles/sharepoint-hosting-3-ways-to-deploy-infopath-form-templates-to-sharepoint-2605874.html
    http://jopx.blogspot.com/2006/08/infopath-2007-solving-xsn-can-not-be.html

    Read the article

  • SharePoint Designer 2010 Workflow Email Link To Item

    - by Brian Jackett
    In this post I'll walk you through the process of sending an email that contains a link to the current item from a SharePoint Designer 2010 workflow. This is a process that has been published on many other forums and blogs, but many that I have seen are more complex than seems necessary.

    Problem

    A common request from SharePoint users is to get an email which contains a link to review/approve/edit the workflow item. SharePoint list items contain an automatic property for Url Path, but unfortunately that Url is not properly formatted to retrieve the item if you include it directly in the message body. I tried a few solutions suggested on other blogs and forums that took a substring of the Url Path property, concatenated the display form view Url, and mixed in some other strings. While I was able to get this working in some scenarios, I still had issues in general.

    Solution

    My solution involves adding a hyperlink to the message body. This ended up being far easier than I had expected and fairly intuitive once I found the correct property to use. Follow these steps to see what I did.

    First add a "Send an Email" action to your workflow. Edit the action to pull up the email configuration dialog, then click the "Add hyperlink" button (shown in a screenshot in the original post). When prompted for the address of the link, click the fx button to perform a lookup. Choose Workflow Context from the "data source" dropdown and Current Item URL from the "field from source" dropdown, then click OK. Your Edit Hyperlink dialog should now look something like the screenshot in the original post.

    The end result is a hyperlink added to your email pointing to the current workflow item. Note: this link points to the non-modal dialog display form (a display form similar to what you had in 2007).

    Conclusion

    In this post I walked you through the steps to create a SharePoint Designer 2010 workflow email that contains a link to the current item. While there are many other options for accomplishing this out on the web, I found this to be a more concise process and easy to understand. Hopefully you found this helpful as well. Feel free to leave any comments or feedback if you've found other ways that were helpful to you.

    -Frog Out

    Read the article

  • Ubuntu 14.04: Lost my sound randomly, tried a few commands and I think I failed

    - by Marc-Antoine Théberge
    I lost my sound the other day. I tried to delete pulseaudio and then reinstall it, then I tried to delete it and install alsa; it did not work and I had to reinstall everything - overall a bad idea. Now I can't get any sound at all. Should I do a fresh install? I don't know how to boot a USB drive with GRUB... Here's my sysinfo:

        System information report, generated by Sysinfo: 2014-05-28 05:45:58
        http://sourceforge.net/projects/gsysinfo

        SYSTEM INFORMATION
        Running Ubuntu Linux, the Ubuntu 14.04 (trusty) release.
        GNOME: 3.8.4 (Ubuntu 2014-03-17)
        Kernel version: 3.13.0-27-generic (#50-Ubuntu SMP Thu May 15 18:08:16 UTC 2014)
        GCC: 4.8 (i686-linux-gnu)
        Xorg: 1.15.1 (16 April 2014 01:40:08PM)
        Hostname: mark-laptop
        Uptime: 0 days 11 h 43 min

        CPU INFORMATION
        GenuineIntel, Intel(R) Atom(TM) CPU N270 @ 1.60GHz
        Number of CPUs: 2
        CPU clock currently at 1333.000 MHz with 512 KB cache
        Numbering: family(6) model(28) stepping(2)
        Bogomips: 3192.13
        Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe nx constant_tsc arch_perfmon pebs bts aperfmperf pni dtes64 monitor ds_cpl est tm2 ssse3 xtpr pdcm movbe lahf_lm dtherm

        MEMORY INFORMATION
        Total memory: 2007 MB
        Total swap: 1953 MB

        STORAGE INFORMATION
        SCSI device - scsi0
        Vendor: ATA
        Model: ST9160310AS

        HARDWARE INFORMATION

        MOTHERBOARD
        Host bridge: Intel Corporation Mobile 945GSE Express Memory Controller Hub (rev 03)
        Subsystem: ASUSTeK Computer Inc. Device 8340
        PCI bridge(s):
        Intel Corporation NM10/ICH7 Family PCI Express Port 1 (rev 02) (prog-if 00 [Normal decode])
        Intel Corporation NM10/ICH7 Family PCI Express Port 2 (rev 02) (prog-if 00 [Normal decode])
        Intel Corporation NM10/ICH7 Family PCI Express Port 4 (rev 02) (prog-if 00 [Normal decode])
        Intel Corporation 82801 Mobile PCI Bridge (rev e2) (prog-if 01 [Subtractive decode])
        ISA bridge: Intel Corporation 82801GBM (ICH7-M) LPC Interface Bridge (rev 02)
        Subsystem: ASUSTeK Computer Inc. Device 830f
        IDE interface: Intel Corporation 82801GBM/GHM (ICH7-M Family) SATA Controller [IDE mode] (rev 02) (prog-if 80 [Master])
        Subsystem: ASUSTeK Computer Inc. Device 830f

        GRAPHIC CARD
        VGA controller: Intel Corporation Mobile 945GSE Express Integrated Graphics Controller (rev 03) (prog-if 00 [VGA controller])
        Subsystem: ASUSTeK Computer Inc. Device 8340

        SOUND CARD
        Multimedia controller: Intel Corporation NM10/ICH7 Family High Definition Audio Controller (rev 02)
        Subsystem: ASUSTeK Computer Inc. Device 831a

        NETWORK
        Ethernet controller: Qualcomm Atheros AR8121/AR8113/AR8114 Gigabit or Fast Ethernet (rev b0)
        Subsystem: ASUSTeK Computer Inc. Device 8324
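    (A hedged recovery sketch, not from the original question: reinstalling the stock audio stack is usually safer than a fresh install after audio packages have been removed. The package names are the standard Ubuntu 14.04 ones, but treat them as assumptions for this particular setup.)

        # Reinstall the ALSA and PulseAudio stacks that ship with Ubuntu 14.04.
        sudo apt-get update
        sudo apt-get install --reinstall alsa-base alsa-utils pulseaudio

        # Reset the per-user PulseAudio configuration in case it is corrupted.
        rm -rf ~/.config/pulse

        # Reboot, then verify the HDA Intel device is detected and unmuted:
        # aplay -l
        # alsamixer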

    Read the article

  • Sql Server Express Profiler

    - by csharp-source.net
    Sql Server Express Profiler is a profiler for MS SQL Server 2005 Express. SQL Server Express Edition itself does not include professional tools for profiling SQL queries, so SQL Server Express Edition Profiler provides most of the functionality the standard profiler does, such as choosing events to profile, setting filters, etc. This project is a .NET WinForms application and, in the future, an AJAX-enabled web site which will provide the functionality of Microsoft SQL Profiler.

    Read the article

  • A Deep Dive into Transport Queues (Part 2)

    Johan Veldhuis completes his 'Deep Dive' by plunging even deeper into the mysteries of MS Exchange's Transport queues, which temporarily store messages waiting to be passed to the next stage, and explains how to change the way they work via configuration settings.

    Read the article

  • how to convert avi (xvid) to mkv or mp4 (h264)

    - by bcsteeve
    Very noob when it comes to video. I'm trying to make sense of what I'm finding via Google... but it's mostly Greek to me. I have a bunch of AVI files that won't play in my WD TV Play box. Mediainfo tells me they are XviD. Specs for the box show that should be fine... but digging through forums says it's hit-and-miss. So I'd like to try converting them to h264-encoded MKV or MP4 files. I gather avconv is the tool, but reading the manual just has me really confused. I tried the very basic example of:

        avconv -i file.avi -c copy file.mp4

    It took less than 4 seconds. And it worked... sort of. It "played", in that something came up on the screen... but there was horrible artifacting and scenes would just sort of melt into each other. I want to preserve quality if possible. I'm not concerned about file size. I'm not terribly concerned with the time it takes either, provided I can do them in a batch. Can someone familiar with the process please give me a command with the options? Thank you for your help. I'm posting the mediainfo in case it helps:

        General
        Complete name : \\SERVER\Video\Public\test.avi
        Format : AVI
        Format/Info : Audio Video Interleave
        File size : 189 MiB
        Duration : 11mn 18s
        Overall bit rate : 2 335 Kbps
        Writing application : Lavf52.32.0

        Video
        ID : 0
        Format : MPEG-4 Visual
        Format profile : Advanced Simple@L5
        Format settings, BVOP : 2
        Format settings, QPel : No
        Format settings, GMC : No warppoints
        Format settings, Matrix : Default (H.263)
        Muxing mode : Packed bitstream
        Codec ID : XVID
        Codec ID/Hint : XviD
        Duration : 11mn 18s
        Bit rate : 2 129 Kbps
        Width : 720 pixels
        Height : 480 pixels
        Display aspect ratio : 16:9
        Frame rate : 29.970 fps
        Standard : NTSC
        Color space : YUV
        Chroma subsampling : 4:2:0
        Bit depth : 8 bits
        Scan type : Progressive
        Compression mode : Lossy
        Bits/(Pixel*Frame) : 0.206
        Stream size : 172 MiB (91%)
        Writing library : XviD 1.2.1 (UTC 2008-12-04)

        Audio
        ID : 1
        Format : MPEG Audio
        Format version : Version 1
        Format profile : Layer 3
        Mode : Joint stereo
        Mode extension : MS Stereo
        Codec ID : 55
        Codec ID/Hint : MP3
        Duration : 11mn 18s
        Bit rate mode : Constant
        Bit rate : 192 Kbps
        Channel(s) : 2 channels
        Sampling rate : 48.0 KHz
        Compression mode : Lossy
        Stream size : 15.5 MiB (8%)
        Alignment : Aligned on interleaves
        Interleave, duration : 24 ms (0.72 video frame)
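    (A hedged answer sketch, not from the original post: it assumes avconv was built with libx264 support, which on Ubuntu typically means having libav-tools plus the extra codec packages installed. -c copy only remuxes; the commands below actually re-encode the video.)

        # Re-encode the XviD video to H.264 and keep the MP3 audio as-is.
        # CRF 18 is near-transparent quality; raise it for smaller files.
        avconv -i file.avi -c:v libx264 -preset slow -crf 18 -c:a copy file.mkv

        # Batch-convert every AVI in the current directory.
        for f in *.avi; do
            avconv -i "$f" -c:v libx264 -preset slow -crf 18 -c:a copy "${f%.avi}.mkv"
        done

    Since the audio stream is only copied, nearly all the time goes into the video encode; move -preset toward fast if speed matters more than file size.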

    Read the article

  • Data Generator Source Adapter

    This component needs little explanation. It generates random integer (DT_I4) and string (DT_WSTR) data and places them in the pipeline. You specify how many columns of each you would like, and for any string columns you pass a fixed length value. You then need to specify how many rows in total you require to be generated. This component is used by us to do testing of the pipeline and components downstream. Previously we would have used a script component (as a source) to generate the rows, but found ourselves rewriting the code too often, so we created this component.

    Screenshots
    (The original page shows screenshots for SQL Server 2005 Integration Services and SQL Server 2008/2012 Integration Services.)

    The component is provided as an MSI file; however, to complete the installation, you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Data Generator Source from the list.

    Downloads
    The Data Generator Source Adapter is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.
    Data Generator Source Adapter for SQL Server 2005
    Data Generator Source Adapter for SQL Server 2008
    Data Generator Source Adapter for SQL Server 2012

    Version History
    SQL Server 2012
    Version 3.0.0.30 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
    SQL Server 2008
    Version 2.0.0.29 - SQL Server 2008 February 2008 CTP. Includes support for upgrade of 2005 packages. Simplified user interface. (4 Mar 2008)
    Version 2.0.0.27 - SQL Server 2008 November 2007 CTP. String columns will now use the default system code page. Previously string columns always used 1252. (15 Feb 2008)
    SQL Server 2005
    Version 1.1.0.23 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. (12 Jun 2006)
    Version 1.0.0.0 - SQL Server 2005 IDW 16 Sept CTP. Public release. (6 Oct 2005)

    Read the article

  • SharePoint 2010 Hosting :: How to Enable Office Web Apps on SharePoint 2010

    - by mbridge
    Office Web Apps is the online version of Microsoft Office 2010, which is very helpful if you are going to use SharePoint 2010 in your organization, as it allows basic editing of Word documents without installing the Office suite on the client machine.

    Prerequisites:
    - Microsoft Server 2008 R2
    - Microsoft SharePoint Server 2010 or Microsoft SharePoint Foundation 2010
    - Microsoft Office Web Apps

    If you have installed all the above products, just follow these steps:
    1. Go to Central Administration > click on Manage Service Applications.
    2. All the menus are now displayed in the ribbon menu format first introduced in Office 2007. Click on New > Word Viewing Service (you can choose PowerPoint or Excel as well; the steps are the same). This will open a popup window.
    3. Give it a proper name, which can include your company or project name.
    4. Under Application Pool, select: SharePoint Web Services Default.
    5. Next, keep the checkbox checked which says: "Add this service application's proxy to the farm's default proxy list." Click OK.
    6. This will install all the Office Web Apps services required. You will see the service under the name you gave in the step above.

    How to activate Office Web Apps in a site collection?
    1. Go to the site for which you want to activate this feature.
    2. Click on Site Actions > Site Settings > Site Collection Administration > Site Collection Features.
    3. Activate Office Web Apps.

    How to make sure Office Web Apps is working for your site collection?
    1. Locate any Office document you have and click on the smart menu which appears when you hover your mouse over it. Don't double-click, as this will launch the document in the Office client if it is installed (this behaviour can be changed).
    2. If you see View in Browser or Edit in Browser as a menu item, Office Web Apps is configured correctly.

    Another post related to SharePoint 2010:
    1. How to Configure SharePoint Foundation 2010 for SharePoint Workspace 2010
    2. Integrating SharePoint 2010 and SQL 2008 R2
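    (A PowerShell alternative for the feature activation step; this is my sketch, not part of the original post, and the feature identity "OfficeWebApps" is an assumption - list the farm's features first to confirm the exact name.)

        # Run from the SharePoint 2010 Management Shell.
        # Find the Office Web Apps feature name on this farm.
        Get-SPFeature | Where-Object { $_.DisplayName -like "*WebApp*" }

        # Activate the feature on a specific site collection (URL assumed).
        Enable-SPFeature -Identity "OfficeWebApps" -Url "http://intranet/sites/team"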

    Read the article

  • Are Java servers really more preferable for web-development? [closed]

    - by Gerald Goward
    Many experienced people I know tell me that many web projects, including enterprise ones, are better developed with a Java back-end. Reasons: Ubuntu servers are cheaper and more reliable, and MySQL is much more lightweight than the "heavyweight" MS SQL. I heard some more, but I really can't remember all of them. My question: I believe ASP.NET and Java are BOTH good for web development, and this is just a holy-war subject. Am I right or not?

    Read the article

  • VS 2010 snippet manager and quickCode

    - by Michael Freidgeim
    During the last few years I've used QuickCode and it was very helpful. VS 2010 has a Code Snippets Manager that MS finally made quite convenient to use, so I will use it in the future. I will need to convert my QuickCode commands to VS snippets. I am sure I will miss the Alt-Q hotkey for some time.
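    (For reference, a minimal .snippet file sketch for that conversion; this is my illustration rather than anything from the post, and the shortcut, title and body are hypothetical. Files placed under Documents\Visual Studio 2010\Code Snippets can be registered via the Code Snippets Manager, Ctrl+K, Ctrl+B.)

        <?xml version="1.0" encoding="utf-8"?>
        <CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
          <CodeSnippet Format="1.0.0">
            <Header>
              <Title>Property with backing field</Title>
              <Shortcut>qprop</Shortcut>
            </Header>
            <Snippet>
              <Declarations>
                <Literal><ID>type</ID><Default>int</Default></Literal>
                <Literal><ID>name</ID><Default>Value</Default></Literal>
              </Declarations>
              <Code Language="CSharp">
                <![CDATA[private $type$ _$name$;
        public $type$ $name$
        {
            get { return _$name$; }
            set { _$name$ = value; }
        }]]>
              </Code>
            </Snippet>
          </CodeSnippet>
        </CodeSnippets>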

    Read the article

  • Newsletter sent with Drupal goes to Spam Folder [closed]

    - by HerrSerker
    Possible Duplicate: How could I prevent my mail from being recognized as spam? I'm sending a newsletter with drupals simplenews module The website is hosted on an 1und1 server in germany (as seen in in header domains online.de and kundenserver.de) When I send it, it goes to Spam folder in Yahoo & GMail Mailbox, but not in Spam Folder in web.de, hotmail and GMX Mailboxes Here is, what I have in the Mail Header (for yahoo in this example) Received: from 12.345.678.90 (EHLO sXXXXXXXXX.online.de) (12.345.678.90) by mtaXXX.mail.kks.yahoo.co.jp with SMTP; Fri, 15 Jun 2012 18:45:24 +0900 Received: from [127.0.0.1] (helo=infongdXXXXX.rtr.kundenserver.de) by sXXXXXXXXX.online.de with esmtp (Exim 4.72) (envelope-from <[email protected]>) id 1SfT5k-00068r-Q8 for [email protected]; Fri, 15 Jun 2012 11:45:20 +0200 Received: from 83.136.130.41 (IP may be forged by CGI script) by infongdXXXXX.rtr.kundenserver.de with HTTP id 0Z04SW-1SQTKp3LPr-00YxYk; Fri, 15 Jun 2012 11:45:20 +0200 From: SENDER <[email protected]> To: "[email protected]" <[email protected]> Date: Fri, 15 Jun 2012 11:45:20 +0200 Subject: This is the subject of the newsletter Thread-Topic: This is the subject of the newsletter Thread-Index: Ac1K3nT42juzo7uCSkq5dTlby1ZvpQ== List-Unsubscribe: <http://www.example.com/newsletter/confirm/remove/XXXXXXXXX> X-MS-Has-Attach: X-Auto-Response-Suppress: All X-MS-TNEF-Correlator: x-originating-ip: [12.345.678.90] authentication-results: mtaXXX.mail.kks.yahoo.co.jp from=example.com; domainkeys=neutral (no sig); dkim=neutral (no sig) [email protected] errors-to: "SENDER" <[email protected]> received-spf: none (sXXXXXXXXX.online.de: domain of [email protected] does not designate permitted sender hosts) x-apparently-to: [email protected] via 123.45.67.890; Fri, 15 Jun 2012 18:45:25 +0900 x-sender-info: <[email protected]> content-length: 13762 Content-Type: multipart/alternative; boundary="_000_7471797868716571796675707173696675806577726778666766687_" MIME-Version: 1.0 I cannot see any direct spam filter message in this. But I'm kind of stunned by the Received: from 83.136.130.41 (IP may be forged by CGI script) part. After I searched a bit, it seems, that this is a special 'feature' of 1und1 Mail servers. Here are my questions: Is it possible that, if I get rid of the 'Ip maybe forged' part, that the Mail is not regarded as spam anymore? If so, Does anyone know, how I can get rid of it in drupal?

    Read the article

  • mongoDB Management Studio

    - by Liam McLennan
    This weekend I have been in Sydney at the MS Web Camp, learning about web application development. At the end of the first day we came up with application ideas and pitched them. My idea was to build a web management application for mongoDB.

    mongoDB

    I pitched my idea, put down the microphone, and then someone asked, "what's mongo?". Good question. MongoDB is a document database that stores JSON-style documents. This is a JSON document for a tweet from Twitter:

        db.tweets.find()[0]
        {
          "_id" : ObjectId("4bfe4946cfbfb01420000001"),
          "created_at" : "Thu, 27 May 2010 10:25:46 +0000",
          "profile_image_url" : "http://a3.twimg.com/profile_images/600304197/Snapshot_2009-07-26_13-12-43_normal.jpg",
          "from_user" : "drearyclocks",
          "text" : "Does anyone know who has better coverage, Optus or Vodafone? Telstra is still too expensive.",
          "to_user_id" : null,
          "metadata" : { "result_type" : "recent" },
          "id" : { "floatApprox" : 14825648892 },
          "geo" : null,
          "from_user_id" : 6825770,
          "search_term" : "telstra",
          "iso_language_code" : "en",
          "source" : "&lt;a href=&quot;http://www.tweetdeck.com&quot; rel=&quot;nofollow&quot;&gt;TweetDeck&lt;/a&gt;"
        }

    A mongoDB server can have many databases, each database has many collections (instead of tables), and a collection has many documents (instead of rows).

    Development

    Day 2 of the Sydney MS Web Camp was allocated to building our applications. First thing in the morning I identified the stories that I wanted to implement:

    Scenario: View databases
    Scenario: View Collections in a database
    Scenario: View Documents in a Collection
    Scenario: Delete a Collection
    Scenario: Delete a Database
    Scenario: Delete Documents

    Over the course of the day the team (3.5 developers) implemented all of the planned stories (except 'delete a database') and also implemented the following:

    Scenario: Create Database
    Scenario: Create Collection

    Lessons Learned

    I'm new to MongoDB and in the past I have only accessed it from Ruby (for my hare-brained scheme). When it came to implementing our MongoDB management studio we discovered that there is no official MongoDB driver for .NET. We chose to use NoRM, honestly just because it was the only one I had heard of. NoRM was a challenge. I think it is a fine library, but it is focused on mapping strongly typed objects to MongoDB. For our application we had no prior knowledge of the types that would be in the MongoDB database, so NoRM was probably a poor choice. (The original post closes with some screenshots.)
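    (For reference - my addition, not from the post - the planned stories map directly onto one-liners in the mongo shell, which makes a handy sanity check for what the management UI has to cover.)

        // View databases
        show dbs

        // View collections in the current database
        show collections

        // View documents in a collection
        db.tweets.find()

        // Delete documents matching a filter
        db.tweets.remove({ from_user: "drearyclocks" })

        // Delete a collection
        db.tweets.drop()

        // Delete the current database
        db.dropDatabase()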

    Read the article

  • A Bit Cloudy

    - by Chris Massey
    "Systems Administrators, I come in peace. You have nothing to fear from me" - Office 365 Microsoft Business Productivity Online Suite recently absorbed a few other services and has been rebranded as Office 365, which is currently in private Beta and NDA-d up to the eyeballs. As Microsoft's (slightly delayed) answer to Google Apps Premier Edition, it shows a lot of promise; MS has technical expertise, market penetration, and financial capital all going for it. On the other hand, Google...(read more)

    Read the article

  • Why is Double.Parse so slow?

    - by alexhildyard
    I was recently investigating a bottleneck in one of my applications, which read a CSV file from disk using a TextReader a line at a time, split the tokens, called Double.Parse on each one, then shunted the results into an object list. I was surprised to find it was actually the Double.Parse which seemed to be taking up most of the time. Googling turned up this, which is a little unfocused in places but throws out some excellent ideas:

    - It makes more sense to work with binary format directly, rather than coerce strings into doubles
    - There is a significant performance improvement in composing doubles directly from the byte stream via long intermediaries
    - String.Split is inefficient on fixed-length records

    In fact it turned out that my problem was more insidious and also more mundane: a simple case of bad data in, bad data out. Since I had been serialising my Doubles as strings, when I inadvertently divided by zero and produced a "NaN", this of course was serialised as well without error. And because I was reading in using Double.Parse, these "NaN" fields were also (correctly) populating real Double objects without error. The issue is that Double.Parse("NaN") is incredibly slow. In fact, it is of the order of 2000x slower than parsing a valid double. For example, the code below gave me results of 357 ms to parse 1,000 NaNs, versus 15 ms to parse 100,000 valid doubles.

        const int invalid_iterations = 1000;
        const int valid_iterations = invalid_iterations * 100;
        const string invalid_string = "NaN";
        const string valid_string = "3.14159265";

        DateTime start = DateTime.Now;

        // Parse "NaN" repeatedly; this path is dramatically slower.
        for (int i = 0; i < invalid_iterations; i++)
        {
            double invalid_double = Double.Parse(invalid_string);
        }

        // Note: TimeSpan.Milliseconds is the milliseconds *component* and wraps
        // at 1000; these runs stay under a second, but TotalMilliseconds (or a
        // Stopwatch) would be safer for longer timings.
        Console.WriteLine(String.Format(
            "{0} iterations of invalid double, time taken (ms): {1}",
            invalid_iterations,
            ((TimeSpan)DateTime.Now.Subtract(start)).Milliseconds));

        start = DateTime.Now;

        // Parse a well-formed double 100x as many times.
        for (int i = 0; i < valid_iterations; i++)
        {
            double valid_double = Double.Parse(valid_string);
        }

        Console.WriteLine(String.Format(
            "{0} iterations of valid double, time taken (ms): {1}",
            valid_iterations,
            ((TimeSpan)DateTime.Now.Subtract(start)).Milliseconds));

    I think the moral is to look at the context - specifically the data - as well as the code itself. Once I had corrected my data, the performance of Double.Parse was perfectly acceptable, and while clearly it could have been improved, it was now sufficient to my needs.
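    (The upstream fix is cheap too. A hedged sketch, my addition rather than the original author's code, of guarding the serialisation side so "NaN" never lands in the CSV in the first place; the class and method names are hypothetical.)

        using System;
        using System.Globalization;

        static class SafeCsvFields
        {
            // Write an empty field instead of "NaN"/"Infinity", so the reader
            // never pays the Double.Parse("NaN") penalty.
            public static string ToField(double value)
            {
                if (double.IsNaN(value) || double.IsInfinity(value))
                    return string.Empty; // or throw, to surface the bad maths early

                return value.ToString("R", CultureInfo.InvariantCulture);
            }

            // TryParse avoids exceptions on bad fields; anything unparseable
            // (including the empty sentinel) comes back as 0.0 here.
            public static double FromField(string field)
            {
                double result;
                return double.TryParse(field, NumberStyles.Float,
                    CultureInfo.InvariantCulture, out result) ? result : 0.0;
            }
        }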

    Read the article
