Search Results

Search found 19822 results on 793 pages for 'ms ftmg 2010'.

  • web.config transforms not being applied on either publish or build installation package

    - by BenA
    Today I started playing with the web.config transforms in VS 2010. To begin with, I attempted the same hello-world example that features in a lot of the blog posts on this topic - updating a connection string. I created the minimal example shown below (similar to the one found in this blog). The problem is that whenever I do a right-click - "Publish", or a right-click - "Build Deployment Package" on the .csproj file, I'm not getting the correct output. Rather than a transformed web.config, I'm getting no web.config at all; instead the two transform files are included. What am I doing wrong? Any help gratefully received!

    Web.config:

      <?xml version="1.0"?>
      <configuration xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0">
        <connectionStrings>
          <add name="ConnectionString" connectionString="server=(local); initial catalog=myDB; user=xxxx;password=xxxx" providerName="System.Data.SqlClient"/>
        </connectionStrings>
      </configuration>

    Web.debug.config:

      <?xml version="1.0"?>
      <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
        <connectionStrings>
          <add name="ConnectionString" connectionString="server=DebugServer; initial catalog=myDB; user=xxxx;password=xxxx" providerName="System.Data.SqlClient" xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
        </connectionStrings>
      </configuration>

    Web.release.config:

      <?xml version="1.0"?>
      <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
        <connectionStrings>
          <add name="ConnectionString" connectionString="server=ReleaseServer; initial catalog=myDB; user=xxxx;password=xxxx" providerName="System.Data.SqlClient" xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
        </connectionStrings>
      </configuration>
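    One way to see what the transform engine itself produces, independent of the Publish / package pipeline, is to call the same TransformXml task from a small MSBuild target. This is only a diagnostic sketch: the target name and the Web.transformed.config output path are made up for illustration, and the assembly path assumes a default VS 2010 install.

      <UsingTask TaskName="TransformXml"
                 AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
      <Target Name="PreviewTransform">
        <!-- applies the configuration-specific transform and writes the result where it can be inspected -->
        <TransformXml Source="Web.config"
                      Transform="Web.$(Configuration).config"
                      Destination="Web.transformed.config" />
      </Target>

    Running msbuild /t:PreviewTransform against the project then shows whether the transform files themselves are applied correctly, which helps separate a transform problem from a packaging problem.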

    Read the article

  • Creating .lib files in CUDA Toolkit 5

    - by user1683586
    I am taking my first faltering steps with CUDA Toolkit 5.0 RC using VS2010. Separate compilation has me confused. I tried to set up a project as a Static Library (.lib), but when I try to build it, it does not create a device-link.obj and I don't understand why. For instance, there are 2 files: a caller function that uses a function f:

      #include "thrust\host_vector.h"
      #include "thrust\device_vector.h"
      using namespace thrust::placeholders;

      extern __device__ double f(double x);

      struct f_func
      {
          __device__ double operator()(const double& x) const { return f(x); }
      };

      void test(const int len, double * data, double * res)
      {
          thrust::device_vector<double> d_data(data, data + len);
          thrust::transform(d_data.begin(), d_data.end(), d_data.begin(), f_func());
          thrust::copy(d_data.begin(), d_data.end(), res);
      }

    And a library file that defines f:

      __device__ double f(double x) { return x+2.0; }

    If I set the option "generate relocatable device code" to No, the first file will not compile due to an unresolved extern function f. If I set it to -rdc, it will compile, but does not produce a device-link.obj file and so the linker fails. If I put the definition of f into the first file and delete the second it builds successfully, but now it isn't separate compilation anymore. How can I build a static library like this with separate source files?

    [Update] I called the first caller file "caller.cu" and the second "libfn.cu". The compiler lines that VS2010 outputs (which I don't fully understand) are (for caller):

      nvcc.exe -ccbin "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -G --keep-dir "Debug" -maxrregcount=0 --machine 32 --compile -g -D_MBCS -Xcompiler "/EHsc /W3 /nologo /Od /Zi /RTC1 /MDd " -o "Debug\caller.cu.obj" "G:\Test_Linking\caller.cu" -clean

    and the same for libfn, then:

      nvcc.exe -gencode=arch=compute_20,code=\"sm_20,compute_20\" --use-local-env --cl-version 2010 -ccbin "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin" -rdc=true -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -I"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v5.0\include" -G --keep-dir "Debug" -maxrregcount=0 --machine 32 --compile -g -D_MBCS -Xcompiler "/EHsc /W3 /nologo /Od /Zi /RTC1 /MDd " -o "Debug\caller.cu.obj" "G:\Test_Linking\caller.cu"

    and again for libfn.
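    For reference, with separate compilation the relocatable device code in the individual objects has to go through an extra device-link step before the host linker or librarian runs. Outside the Visual Studio integration the sequence looks roughly like the sketch below; the architecture flag, object names and library name are illustrative, not taken from the project above.

      nvcc -arch=sm_20 -dc caller.cu libfn.cu                            # compile with relocatable device code
      nvcc -arch=sm_20 -dlink caller.obj libfn.obj -o device-link.obj    # the device-link step that is missing above
      lib /OUT:mylib.lib caller.obj libfn.obj device-link.obj            # bundle everything into the .lib

    In a Visual Studio build the -dlink step is normally emitted by the CUDA 5 build customization at link time; a static-library project has no link step of its own, which may be why no device-link.obj ever appears and why device linking usually has to happen in the application that consumes the library.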

    Read the article

  • delphi vs c# post returns different strings - utf problem?

    - by argh
    I'm posting two forms - one in C# and one in Delphi. But the result string seems to be different: C# returns: ¤@@1@@@@1@@@@1@@xsm˱Â0Ð... Delphi returns: #$1E'@@1@@@@1@@@@1@@x'#$009C... and since both are compressed streams I'm getting errors while trying to decompress them... The C# one is 'correct' - i.e. it extracts. I'm not an expert on Delphi - I just need to convert some piece of code from C# to Delphi.

    C# code:

      string GetData(Hashtable aParam, string ServerURL)
      {
          string Result = "";
          WebRequest Request = HttpWebRequest.Create(ServerURL);
          Request.Method = "POST";
          Request.ContentType = "application/x-www-form-urlencoded; charset=UTF-8";
          UTF8Encoding encUTF8 = new System.Text.UTF8Encoding(false);
          StreamWriter writer = new StreamWriter(Request.GetRequestStream(), encUTF8);
          foreach (DictionaryEntry element in aParam)
          {
              writer.Write(element.Key + "=" + element.Value + "&");
          }
          writer.Close();
          writer.Dispose();
          WebResponse Response = Request.GetResponse();
          StreamReader Reader = new StreamReader(Response.GetResponseStream(), System.Text.Encoding.Default);
          Result = Reader.ReadToEnd();
          Reader.Close();
          Response.Close();
          Reader.Dispose();
          return Result;
      }

    Delphi code:

      function GetData(aParam: TStringList; ServerURL: string): string;
      var
        req: TIdHTTP;
        res: string;
      begin
        req := TIdHTTP.Create();
        with req do
        begin
          Request.ContentType := 'application/x-www-form-urlencoded; charset=UTF-8';
          Request.Method := 'POST';
          Request.CharSet := 'utf-8';
          Request.AcceptCharSet := 'utf-8';
          res := Post(ServerURL, aParam);
        end;
        Result := res;
        req.Free;
      end;

    -edit- I'm using Delphi 2010.
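    Since the server reply is a compressed (binary) stream, one thing to check is whether the response should be read into a string at all: on the Delphi side a string result runs the bytes through a character-set conversion, which can corrupt binary data. A hedged sketch (Indy 10 assumed) that keeps the response as raw bytes instead:

      // Post the form fields and read the reply into a stream, so no charset
      // conversion touches the compressed payload. The caller owns the stream.
      function GetDataStream(aParam: TStringList; const ServerURL: string): TMemoryStream;
      var
        req: TIdHTTP;
      begin
        Result := TMemoryStream.Create;
        req := TIdHTTP.Create(nil);
        try
          req.Request.ContentType := 'application/x-www-form-urlencoded; charset=UTF-8';
          req.Post(ServerURL, aParam, Result);  // overload that writes the response body to a stream
          Result.Position := 0;
        finally
          req.Free;
        end;
      end;

    The decompressor can then be fed the TMemoryStream directly, instead of a string that has already been re-encoded.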

    Read the article

  • MySQL very slow compared to MS Access when inserting hundred of thousands of rows ! So is there a my

    - by asksuperuser
    I am currently adding hundreds of thousands of rows of data to a table, first in an MS Access table and then in a MySQL table. I first tried it with MS Access: it took less than 40 seconds. Then I tried exactly the same source data and the same table structure with MySQL, and it took 6 minutes and 40 seconds - that is 1000% slower! So is it a myth that a database server is more performant?
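    For what it's worth, how the rows are sent makes a big difference on MySQL: with autocommit on, every single-row INSERT is committed and flushed on its own, whereas wrapping the load in one transaction or using a multi-row VALUES list sends far fewer round trips and commits. A sketch with an illustrative table (not the one from the question):

      -- one transaction / one commit for the whole batch
      START TRANSACTION;
      INSERT INTO readings (sensor_id, taken_at, value) VALUES
        (1, '2010-05-01 10:00:00', 3.14),
        (1, '2010-05-01 10:01:00', 2.72),
        (2, '2010-05-01 10:00:00', 1.41);
      COMMIT;

    LOAD DATA INFILE is usually faster still when the source can be exported to a flat file.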

    Read the article

  • PHP 5.3.2 + Fcgid 2.3.5 + Apache 2.2.14 + SuExec => Connection reset by peer: mod_fcgid: error reading data from FastCGI server

    - by Zigzag
    I'm trying to use PHP 5.3.2 + Fcgid 2.3.5 + Apache 2.2.14 but I always have the error : "Connection reset by peer: mod_fcgid: error reading data from FastCGI server". And Apache returns an error 500 each time I tried to execute a php page : I have compiled the Apache with this options: ./configure --with-mpm=worker --enable-userdir=shared --enable-actions=shared --enable-alias=shared --enable-auth=shared --enable-so --enable-deflate \ --enable-cache=shared --enable-disk-cache=shared --enable-info=shared --enable-rewrite=shared \ --enable-suexec=shared --with-suexec-caller=www-data --with-suexec-userdir=site --with-suexec-logfile=/usr/local/apache2/logs/suexec.log --with-suexec-docroot=/home Then PHP: ./configure --with-config-file-path=/usr/local/apache2/php --with-apxs2=/usr/local/apache2/bin/apxs --with-mysql --with-zlib --enable-exif --with-gd --enable-cgi Then FCdigd: APXS=/usr/local/apache2/bin/apxs ./configure.apxs The VHOST is: <Directory /home/website_panel/site/> FCGIWrapper /home/website_panel/cgi/php .php ... ErrorLog /home/website_panel/logs/error.log </Directory> cat /home/website_panel/logs/error.log [Sun Mar 07 22:19:41 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server [Sun Mar 07 22:19:41 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php [Sun Mar 07 22:19:41 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server [Sun Mar 07 22:19:41 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php [Sun Mar 07 22:19:42 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server [Sun Mar 07 22:19:42 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php [Sun Mar 07 22:19:43 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server [Sun Mar 07 22:19:43 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php The Suexec log: root:/usr/local/apache2# cat /var/log/apache2/suexec.log [2010-03-07 22:11:05]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php [2010-03-07 22:11:15]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php [2010-03-07 22:11:23]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php [2010-03-07 22:19:41]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php [2010-03-07 22:19:41]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php [2010-03-07 22:19:42]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php [2010-03-07 22:19:43]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php root:/usr/local/apache2# cat logs/error_log [Sun Mar 07 22:18:47 2010] [notice] suEXEC mechanism enabled (wrapper: /usr/local/apache2/bin/suexec) [Sun Mar 07 22:18:47 2010] [notice] mod_bw : Memory Allocated 0 bytes (each conf takes 32 bytes) [Sun Mar 07 22:18:47 2010] [notice] mod_bw : Version 0.7 - Initialized [0 Confs] [Sun Mar 07 22:18:47 2010] [notice] Apache/2.2.14 (Unix) mod_fcgid/2.3.5 configured -- resuming normal operations root:/usr/local/apache2# /home/website_panel/cgi/php -v PHP 5.3.2 (cli) (built: Mar 7 2010 16:01:49) Copyright (c) 1997-2010 The PHP Group Zend Engine v2.3.0, Copyright (c) 1998-2010 Zend Technologies If someone has got an idea, I want to hear it ^^ Thanks !
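    One pattern that commonly produces this exact pair of messages ((104) Connection reset by peer followed by "Premature end of script headers") is the php-cgi process exiting sooner than mod_fcgid expects, for example when PHP's own FastCGI request limit is lower than mod_fcgid's, or when no limits are set at all. A hedged sketch of a wrapper plus matching directives - the paths and numbers are illustrative, not taken from the setup above:

      #!/bin/sh
      # /home/website_panel/cgi/php (example wrapper): let PHP recycle itself no sooner
      # than mod_fcgid expects, then hand control to the real php-cgi binary.
      PHP_FCGI_MAX_REQUESTS=500
      export PHP_FCGI_MAX_REQUESTS
      exec /usr/local/php/bin/php-cgi

    and in the Apache configuration:

      <IfModule mod_fcgid.c>
          # keep this no higher than PHP_FCGI_MAX_REQUESTS in the wrapper
          FcgidMaxRequestsPerProcess 500
          FcgidIOTimeout 120
      </IfModule>

    It is also worth confirming that the binary the wrapper execs really is the CGI/FastCGI build of PHP; the "PHP 5.3.2 (cli)" banner above comes from the CLI binary, which cannot speak the FastCGI protocol.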

    Read the article

  • SharePoint 2010 BDC Model Deployment Issue: “The default web application could not be determined.”

    - by Jan Tielens
    Yesterday I tried to deploy a Business Data Connectivity Model project created in Visual Studio 2010 to my SharePoint 2010 test server (all RTM versions), but during the deployment of the solution SharePoint threw me the following error:

      Add Solution: Adding solution 'BCSDemo2.wsp'...
      Deploying solution 'BCSDemo2.wsp'...
      Error occurred in deployment step 'Add Solution': The default web application could not be determined. Set the SiteUrl property in feature BCSDemo2_Feature1 to the URL of the desired site and retry activation.
      Parameter name: properties

    A little bit of searching on the internet taught me that I was not the only one having this issue; in fact Paul Andrew describes how to solve it in this post. Although Paul describes what to do, his explanation is not, let's say, very elaborate. :-) So let's describe the steps in a little more detail: Create a new Business Data Connectivity Model project in Visual Studio 2010 and (optionally) implement all your code, change the model, etc. When you try to deploy you get the error mentioned above. To fix it, in the Solution Explorer navigate to and open the Feature1.Template.xml file (the name could be different if you decided to give your feature a different name, of course). Add the following XML to the Feature element that's already there (replace the Value with the URL of your site, of course):

      <Properties>
        <Property Key='SiteUrl' Value='http://spf.u2ucourse.com'/>
      </Properties>

    The resulting XML should look like:

      <?xml version="1.0" encoding="utf-8" ?>
      <Feature xmlns="http://schemas.microsoft.com/sharepoint/">
        <Properties>
          <Property Key='SiteUrl' Value='http://spf.u2ucourse.com'/>
        </Properties>
      </Feature>

    Deploy the solution, now without any issues. :-) What happens now is that when Visual Studio creates the SharePoint solution (the WSP file), it uses the feature template XML to generate the feature manifest, which will now include the missing property.

    Read the article

  • Ribbon Search: Locate MS Office Ribbon Menu Features/Functions Quickly

    - by Kavitha
    In the new versions of Microsoft Office everything has changed with the introduction of Ribbon menus. Even though the Ribbon has many advantages that simplify accessing features, at times it is a daunting task to navigate the Ribbon menus and find a specific command. Ribbon Search is one of the more interesting freeware tools that address this complaint: with it, you can search the Office Ribbon for any feature or function easily. It supports both Office 2007 and Office 2010 (the versions which have the Ribbon). Once installation has completed, you will find a text box on top of the Ribbon in all the Office applications (Outlook, Word, PowerPoint, Excel, etc.). As you type the first few letters of the feature you are looking for, Ribbon Search instantly displays the path through which you can access that feature. Here is a screen grab of Ribbon Search in action: it shows results as soon as you start typing, along with the path through which you can reach the feature you are searching for. If there are multiple ways to access the feature, they are all shown in the list. Download Ribbon Search.

    Read the article

  • Few questions on MS duo

    - by user23950
    I constantly delete and add data on my Memory Stick Duo, because I watch movies on my PSP. Here are my questions: Does constantly deleting and adding movies on the MS Duo degrade its performance? Does constantly deleting and adding movies shorten its life span? How many delete/write cycles can a SanDisk MS Duo sustain before it wears out?

    Read the article

  • MacBook Pro 10.6 losing dns service, network connection still functional if you know the ip address.

    - by Vincent
    MacBook pro connected to a wireless network (not sure about wired) I lose DNS. I still have a functioning connection and as long as I know the ip address of the website, server... for example skype works, ssh name@ipaddress, .... Things can be working properly and then just quit, Once I was im via skype and lost dns skype continued to work. This has happened in multiple locations on private and public networks. What does not work/fix it: Resetting router changing dns server on computer or router connecting to another network removing the airport interface and adding it back flushing dns The only solution seems to be a restart. A solution to this would be great, but any ideas of this to try would be great. Even a sure way to reproduce this would be useful. Maybe related question: But this is most definitely not true for me. "if I refresh enough -- 3 to 4 times --, it will usually pull up the site. " Here are some tests from terminal. Basically this confirms dns in not functioning vmd17:~ vmd$ ping google.com ping: cannot resolve google.com: Unknown host Trace route to google dns, This works vmd17:~ vmd$ /usr/sbin/traceroute -n -w 2 -q 2 -m 30 8.8.8.8 traceroute to 8.8.8.8 (8.8.8.8), 30 hops max, 52 byte packets 1 192.168.1.1 5.195 ms 2.519 ms 2 67.172.136.1 31.881 ms 9.177 ms 3 68.85.107.121 12.168 ms 10.003 ms 4 68.86.103.41 12.021 ms 9.594 ms 5 68.86.91.1 16.712 ms 12.837 ms 6 68.86.86.210 29.951 ms 25.826 ms 7 68.86.87.218 29.554 ms 42.894 ms 8 75.149.231.70 68.271 ms 68.362 ms 9 72.14.233.77 141.178 ms 72.14.233.85 82.553 ms 10 72.14.238.243 83.381 ms 82.811 ms 11 72.14.232.213 194.387 ms 72.14.232.215 84.837 ms 12 209.85.253.145 100.294 ms * 13 8.8.8.8 101.689 ms 89.694 ms 208.67.222.22 is the ip address of opendns dns server vmd17:~ vmd$ dig @208.67.222.222 8.8.8.8 ; <<>> DiG 9.6.0-APPLE-P2 <<>> @208.67.222.222 8.8.8.8 ; (1 server found) ;; global options: +cmd ;; connection timed out; no servers could be reached vmd17:~ vmd$ dig @208.67.222.222 gogle.com vmd17:~ vmd$ dig @208.67.222.222 google.com ; <<>> DiG 9.6.0-APPLE-P2 <<>> @208.67.222.222 google.com ; (1 server found) ;; global options: +cmd ;; connection timed out; no servers could be reached vmd17:~ vmd$ dig @8.8.8.8 google.com ; <<>> DiG 9.6.0-APPLE-P2 <<>> @8.8.8.8 google.com ; (1 server found) ;; global options: +cmd ;; connection timed out; no servers could be reached

    Read the article

  • SQL query. An unusual join. DB implemented in sqlite-3

    - by user02814
    This is essentially a question about constructing an SQL query. The db is implemented with sqlite3. I am a relatively new user of SQL. I have two tables and want to join them in an unusual way. The following is an example to explain the problem.

    Table 1 (t1):

      id    year   name
      -------------------------
      297   2010   Charles
      298   2011   David
      300   2010   Peter
      301   2011   Richard

    Table 2 (t2):

      id    year   food
      ---------------------------
      296   2009   Bananas
      296   2011   Bananas
      297   2009   Melon
      297   2010   Coffee
      297   2012   Cheese
      298   2007   Sugar
      298   2008   Cereal
      298   2012   Chocolate
      299   2000   Peas
      300   2007   Barley
      300   2011   Beans
      300   2012   Chickpeas
      301   2010   Watermelon

    I want to join the tables on id and year. The catch is that id must match exactly, but if there is no exact match in Table 2 for the year in Table 1, then I want to choose the next (lower) available year. A selection of the kind that I want to produce would give the following result:

      id    year   matchyr   name      food
      -------------------------------------------------
      297   2010   2010      Charles   Coffee
      298   2011   2008      David     Cereal
      300   2010   2007      Peter     Barley
      301   2011   2010      Richard   Watermelon

    To summarise, id=297 had an exact match for year=2010 given in Table 1, so the corresponding line for id=297, year=2010 is chosen from Table 2. id=298, year=2011 did not have a matching year in Table 2, so the next available year (less than 2011) is chosen. As you can see, I would also like to know what that matched year (whether exact or inexact) actually was. I would very much appreciate (1) an indication (yes/no answer) of whether this is possible in SQL alone, or whether I need to look outside SQL, and (2) a solution, if that is not too onerous.
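    This kind of "same id, closest year at or below" match is possible in SQL alone. One way to express it in sqlite3 is with correlated subqueries; a sketch (not necessarily the most efficient formulation for large tables):

      SELECT t1.id,
             t1.year,
             (SELECT MAX(t2.year) FROM t2
               WHERE t2.id = t1.id AND t2.year <= t1.year) AS matchyr,
             t1.name,
             (SELECT t2.food FROM t2
               WHERE t2.id = t1.id AND t2.year <= t1.year
               ORDER BY t2.year DESC LIMIT 1) AS food
      FROM t1;

    Rows from t1 with no candidate year at all would come back with NULL in matchyr and food, which a WHERE clause can filter out if they are unwanted.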

    Read the article

  • MegaCli newly created disk doesn't appear under /dev/sdX

    - by Henry-Nicolas Tourneur
    After having successfully added 2 new disks in a new RAID virtual drive (background initialization done), I would have exepected it to appear under /dev/sdh but it's not there (so, unusable). The system is running a CentOS 5.2 64 bits, HAL and udev daemons are running, not records of any sdh apparition under the messsage log file or in dmesg, only MegaCli do see that virtual drive. Any idea ? Some data: [root@server ~]# ./MegaCli -LDInfo -LALL -a0 Adapter 0 -- Virtual Drive Information: Virtual Disk: 0 (target id: 0) Name: RAID Level: Primary-1, Secondary-0, RAID Level Qualifier-0 Size:139392MB State: Optimal Stripe Size: 64kB Number Of Drives:2 Span Depth:1 Default Cache Policy: WriteBack, ReadAheadNone, Direct, No Write Cache if Bad BBU Current Cache Policy: WriteBack, ReadAheadNone, Direct, No Write Cache if Bad BBU Access Policy: Read/Write Disk Cache Policy: Disk's Default Virtual Disk: 1 (target id: 1) Name: RAID Level: Primary-1, Secondary-0, RAID Level Qualifier-0 Size:285568MB State: Optimal Stripe Size: 64kB Number Of Drives:2 Span Depth:1 Default Cache Policy: WriteBack, ReadAheadNone, Direct, No Write Cache if Bad BBU Current Cache Policy: WriteBack, ReadAheadNone, Direct, No Write Cache if Bad BBU Access Policy: Read/Write Disk Cache Policy: Disk's Default [root@server ~]# ls -l /dev/disk/by-id/scsi-360* lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36001ec90f82fe100108ca0a704098d09 -> ../../sda lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36001ec90f82fe100108ca0a704098d09-part1 -> ../../sda1 lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36001ec90f82fe100108ca0a704098d09-part2 -> ../../sda2 lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fe07e78f94940c0000a0ee -> ../../sdf lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fe07e78f94940c0000a0ee-part1 -> ../../sdf1 lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fe972a3f91240a0000005f -> ../../sdb lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fe972a3f91240a0000005f-part1 -> ../../sdb1 lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fea7e18f94640c000020ec -> ../../sde lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fea7e18f94640c000020ec-part1 -> ../../sde1 lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0feb7da8f94340c0000203d -> ../../sdd lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0feb7da8f94340c0000203d-part1 -> ../../sdd1 lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fed7d78f94040c000080b7 -> ../../sdc lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36090a028e0fed7d78f94040c000080b7-part1 -> ../../sdc1 lrwxrwxrwx 1 root root 9 Nov 17 2010 /dev/disk/by-id/scsi-36090a05830145e58e0b9c479000010a1 -> ../../sdg lrwxrwxrwx 1 root root 10 Nov 17 2010 /dev/disk/by-id/scsi-36090a05830145e58e0b9c479000010a1-part1 -> ../../sdg1
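    If the controller created the virtual drive after the system booted, the kernel will not necessarily notice it on its own; asking the SCSI layer to rescan the adapter is a common first step before digging further. A sketch - the host number is illustrative, so list /sys/class/scsi_host/ first to find the one belonging to the MegaRAID adapter:

      # rescan every channel/target/LUN on the given host adapter; harmless for devices already known
      echo "- - -" > /sys/class/scsi_host/host0/scan
      dmesg | tail    # a new sdX entry should show up here if the rescan found the drive

    On a CentOS 5 kernel this usually makes the new /dev/sdX node appear without a reboot.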

    Read the article

  • Can font substitutions in MS Word be controlled?

    - by Jukka K. Korpela
    Suppose that I am typing text in MS Word (any version) and I enter a character that does not exist in the font being used. Say, I’m using Times New Roman and I type 2300 Alt X, which turns into the diameter sign “⌀”, which does not exist in Times New Roman. MS Word picks it up from a different font, like Arial Unicode MS. This may mess up the typographic style, or line spacing. And this happens without notice. Perhaps the most inconvenient feature here is that MS Word does not automatically return to the original font. Subsequent text appears in the replacement font, unless the user sees what is happening and realizes that he needs to change the font. The question is: Can such substitutions be controlled, e.g. by specifying the font(s) to be used as backup fonts? If not, is there any reliable documentation about it?

    Read the article

  • WordPress mod_rewrite redirect specific folders

    - by Ps Cjef
    As a new user, I'm not allowed to post more than two hyperlinks here, so I have added a space after every http (ignore them and read the addresses as full URLs). System: Debian Etch, Apache 2.2. I have a WordPress instance with multiple blogs. I would like to redirect some of the folders based on the year and month, while leaving other folders pointing at their actual locations. Example: I have archives for a few years, like 2010, 2011 and 2012:

      http ://mydomain.com/wordpress/myblog/2010/02
      http ://mydomain.com/wordpress/myblog/2011/01
      http ://mydomain.com/wordpress/myblog/2012/01

    I would like to redirect all 2010 and 2011 posts to another blog with the same folder structure:

      http ://mydomain.com/wordpress/myotherblog/2010/02
      http ://mydomain.com/wordpress/myotherblog/2011/01

    and so on, while 2012 and beyond should go to the actual site (http ://mydomain.com/wordpress/myblog/2012/01). I tried mod_rewrite with the following rules, one rule at a time, to test redirection for just one year (intending to expand later for other years), and none of them worked!

    * RewriteEngine is already on since there are some default WordPress rewrites.
    * RewriteBase is set to http://mydomain.com/wordpress/ .
    * I put my rule before all the other default WordPress rules are processed.

    Didn't work, solution #1:

      RedirectMatch 301 /myblog/2010/(.*) /myotherblog/2010/$1

    Didn't work, solution #2:

      RewriteRule /myblog/2010/(.*) http ://mydomain.com/myotherblog/2010/$1 [R=301]

    Didn't work, solution #3:

      RedirectPermanent /myblog/2010/(.*) http ://mydomain.com/myotherblog/2010/$1

    I've also tried the above rules with and without a fully qualified URL for the new location. The rewrite log, with log level set to 9, did not provide any useful information. It shows that the pattern specified in the rule is checked against the URL, but what finally happens is a passthrough to http ://mydomain.com/myblog/ for all URLs, or a 500 Internal Server Error. Any ideas on where I could be going wrong, or any alternative solutions?
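    For what it's worth, a sketch of one way this is often written - hedged, since it depends on where the rules live. RedirectMatch (mod_alias) matches against the full URL path, so the /wordpress prefix has to be part of the pattern when the blogs live under /wordpress/:

      # send 2010 and 2011 archives of myblog to myotherblog, leave everything else alone
      RedirectMatch permanent ^/wordpress/myblog/(2010|2011)/(.*)$ http://mydomain.com/wordpress/myotherblog/$1/$2

    If the rule has to sit in WordPress's own .htaccess instead, the mod_rewrite equivalent needs to appear above the WordPress block and match the path relative to the RewriteBase:

      RewriteEngine On
      RewriteBase /wordpress/
      RewriteRule ^myblog/(2010|2011)/(.*)$ http://mydomain.com/wordpress/myotherblog/$1/$2 [R=301,L]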

    Read the article

  • How to copy web page text and images to MS Word

    - by Les
    From time to time I want to copy and paste a portion of a web document (viewed in both IE Explorer 7 and 8) into MS Word 2007. The selected text copies and pastes fine, but I am left with only place holders for the images (png). Right clicking the image and clicking copy, then pasting into MS Word doesn't work either. If I paste the image into MS Paint and copy it from there, I can paste it into the Word document. What gives?

    Read the article

  • Hosting company that does Linux VPS and MS SQL

    - by danielmcq
    I'm looking for hosted solutions but there are so many companies that finding the right one using a Google search is a bit overwhelming. Ideally I would like a hosting company which has following options: -Linux VPSs - Individual VPSs should be fairly cheap since I plan on putting one or two services per VPS i.e web server on one (httpd and ColdFusion), an SVN server on another, etc. -Managed MS SQL databases - My company already has data in MS SQL databases and a lot of ColdFusion code written that has MS SQL specific commands in it. -Individually purchased dedicated IP addresses -Preferably located in the North America region My plan would be to setup one Linux VPS as a gateway/firewall/VPN server and have all of my traffic routed through so that my other servers would not use of bandwidth by talking to each other. The trick is also finding a company that does Linux VPS AND MS SQL databases. Does anybody know of any hosting companies offer what I'm looking for? Let me know if I need to add more details.

    Read the article

  • Best practice to use MS Excel as a database

    - by Ying
    In offices, it is popular to use MS Excel to store data. In most cases the data is structured, which means it is suitable for a database. I know people prefer MS Excel because it is easy to change both the data structure and the data values. So my idea is to use MS Excel as a database, provided people follow a general rule for storing the data - in other words, a best practice for using MS Excel as a database. I have thought about using MS Access to store the data, but it is expensive and not as popular as MS Excel. I don't mind buying such a solution, especially if it is for the .NET platform. Any ideas or suggestions are welcome.
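    To make the "Excel as a database" idea concrete on the .NET side: a worksheet can already be queried like a table through the ACE OLE DB provider, which works best when the sheet follows exactly the kind of rule mentioned above (a single header row, one record per row). A minimal sketch - the file path, sheet name and column names are assumptions for illustration:

      using System;
      using System.Data.OleDb;

      class ExcelAsDb
      {
          static void Main()
          {
              // HDR=YES treats the first row as column names; IMEX=1 reads mixed-type columns as text
              var connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\people.xlsx;" +
                            "Extended Properties=\"Excel 12.0 Xml;HDR=YES;IMEX=1\"";
              using (var conn = new OleDbConnection(connStr))
              using (var cmd = new OleDbCommand("SELECT Name, Department FROM [Sheet1$]", conn))
              {
                  conn.Open();
                  using (var reader = cmd.ExecuteReader())
                      while (reader.Read())
                          Console.WriteLine("{0} - {1}", reader["Name"], reader["Department"]);
              }
          }
      }

    The same provider also accepts INSERT and UPDATE statements against a sheet, which is roughly as close as Excel gets to behaving like a real database.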

    Read the article

  • SQL IO and SAN troubles

    - by James
    We are running two servers with identical software setup but different hardware. The first one is a VM on VMWare on a normal tower server with dual core xeons, 16 GB RAM and a 7200 RPM drive. The second one is a VM on XenServer on a powerful brand new rack server, with 4 core xeons and shared storage. We are running Dynamics AX 2012 and SQL Server 2008 R2. When I insert 15 000 records into a table on the slow tower server (as a test), it does so in 13 seconds. On the fast server it takes 33 seconds. I re-ran these tests several times with the same results. I have a feeling it is some sort of IO bottleneck, so I ran SQLIO on both. Here are the results for the slow tower server: C:\Program Files (x86)\SQLIO>test.bat C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS C:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads writing for 120 secs to file C:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 226.97 MBs/sec: 1.77 latency metrics: Min_Latency(ms): 0 Avg_Latency(ms): 281 Max_Latency(ms): 467 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 99 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS C:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads reading for 120 secs from file C:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 91.34 MBs/sec: 0.71 latency metrics: Min_Latency(ms): 14 Avg_Latency(ms): 699 Max_Latency(ms): 1124 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS C :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads writing for 120 secs to file C:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1094.50 MBs/sec: 68.40 latency metrics: Min_Latency(ms): 0 Avg_Latency(ms): 58 Max_Latency(ms): 467 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS C :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 14318180 counts per second 8 threads reading for 120 secs from file C:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: C:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1155.31 MBs/sec: 72.20 latency metrics: Min_Latency(ms): 17 Avg_Latency(ms): 55 Max_Latency(ms): 205 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 
17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 Here are the results of the fast rack server: C:\Program Files (x86)\SQLIO>test.bat C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS E:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file E:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for write): The system cannot find the pa th specified. exiting C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS E:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file E:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for read): The system cannot find the pat h specified. exiting C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS E :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file E:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for write): The system cannot find the pa th specified. exiting C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS E :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file E:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) open_file: CreateFile (E:\TestFile.dat for read): The system cannot find the pat h specified. 
exiting C:\Program Files (x86)\SQLIO>test.bat C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -frandom -b8 -BH -LS c:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file c:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 2575.77 MBs/sec: 20.12 latency metrics: Min_Latency(ms): 1 Avg_Latency(ms): 24 Max_Latency(ms): 655 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 5 8 9 9 9 8 5 3 1 1 1 1 0 0 0 0 0 0 0 0 0 37 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -frandom -b8 -BH -LS c:\Tes tFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file c:\TestFile.dat using 8KB random IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1141.39 MBs/sec: 8.91 latency metrics: Min_Latency(ms): 1 Avg_Latency(ms): 55 Max_Latency(ms): 652 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 91 C:\Program Files (x86)\SQLIO>sqlio -kW -t8 -s120 -o8 -fsequential -b64 -BH -LS c :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads writing for 120 secs to file c:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 341.37 MBs/sec: 21.33 latency metrics: Min_Latency(ms): 5 Avg_Latency(ms): 186 Max_Latency(ms): 120037 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 C:\Program Files (x86)\SQLIO>sqlio -kR -t8 -s120 -o8 -fsequential -b64 -BH -LS c :\TestFile.dat sqlio v1.5.SG using system counter for latency timings, 62500000 counts per second 8 threads reading for 120 secs from file c:\TestFile.dat using 64KB sequential IOs enabling multiple I/Os per thread with 8 outstanding buffering set to use hardware disk cache (but not file cache) using current size: 5120 MB for file: c:\TestFile.dat initialization done CUMULATIVE DATA: throughput metrics: IOs/sec: 1024.07 MBs/sec: 64.00 latency metrics: Min_Latency(ms): 5 Avg_Latency(ms): 61 Max_Latency(ms): 81632 histogram: ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+ %: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100 Three of the four tests are, to my mind, within reasonable parameters for the rack server. However, the 64 write test is incredibly slow on the rack server. (68 mb/sec on the slow tower vs 21 mb/s on the rack). The read speed for 64k also seems slow. Is this enough to say there is some sort of bottleneck with the shared storage? I need to know if I can take this evidence and say we need to launch an investigation into this. Any help is appreciated.

    Read the article

  • VS2010 Assembly Load Error

    - by Nate
    I am getting the following error when I try to build an ASP.NET 4 project in Visual Studio 2010: "Could not load file or assembly 'file:///C:\Dev\project\trunk\bin\Elmah.dll' or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)". I have verified that the dll does, in fact, exist, and is getting copied to the bin folder correctly. I have also tried removing and then re-adding the reference to the project. The build only fails when I switch the Solution Configuration to "Release". It does not fail when the Solution Configuration is set to "Debug". The only difference between the two configurations (that I know of) is shown in the following Web.config transform, Web.Release.config: <?xml version="1.0"?> <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"> <connectionStrings> <add name="SqlServer" connectionString="" providerName="System.Data.SqlClient" xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/> </connectionStrings> <system.web> <compilation xdt:Transform="RemoveAttributes(debug)" /> <customErrors mode="On" xdt:Transform="Replace"> <error statusCode="404" redirect="lost.htm" /> <error statusCode="500" redirect="uhoh.htm" /> </customErrors> </system.web> </configuration> I have tried using Fusion Log Viewer to track down the assembly binding issue, but it looks like it is finding and loading the assembly correctly. Here is the log: *** Assembly Binder Log Entry (6/8/2010 @ 10:01:54 AM) *** The operation was successful. Bind result: hr = 0x0. The operation completed successfully. Assembly manager loaded from: C:\Windows\Microsoft.NET\Framework\v4.0.30319\clr.dll Running under executable c:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\sgen.exe --- A detailed error log follows. === Pre-bind state information === LOG: User = User LOG: Where-ref bind. Location = C:\Dev\project\trunk\bin\Elmah.dll LOG: Appbase = file:///c:/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/ LOG: Initial PrivatePath = NULL LOG: Dynamic Base = NULL LOG: Cache Base = NULL LOG: AppName = sgen.exe Calling assembly : (Unknown). === LOG: This bind starts in LoadFrom load context. WRN: Native image will not be probed in LoadFrom context. Native image will only be probed in default load context, like with Assembly.Load(). LOG: No application configuration file found. LOG: Using host configuration file: LOG: Using machine configuration file from C:\Windows\Microsoft.NET\Framework\v4.0.30319\config\machine.config. LOG: Attempting download of new URL file:///C:/Dev/project/trunk/bin/Elmah.dll. LOG: Assembly download was successful. Attempting setup of file: C:\Dev\project\trunk\bin\Elmah.dll LOG: Entering run-from-source setup phase. LOG: Assembly Name is: Elmah, Version=1.1.11517.0, Culture=neutral, PublicKeyToken=null LOG: Re-apply policy for where-ref bind. LOG: Where-ref bind Codebase does not match what is found in default context. Keep the result in LoadFrom context. LOG: Binding succeeds. Returns assembly from C:\Dev\project\trunk\bin\Elmah.dll. LOG: Assembly is loaded in LoadFrom load context. I feel like there is a fundamental lack of understanding on my part as to what exactly is going on here. Any explanation/help is much appreciated!
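    A note on the symptoms, hedged since it may not apply here: the Fusion log above shows the load happening inside sgen.exe (the serialization assembly generator, which explains why only one build configuration is affected), and HRESULT 0x80131515 with "Operation is not supported" is the error CLR 4 raises when it refuses to load an assembly that still carries the "downloaded from the Internet" zone marker. The commonly reported workarounds are unblocking the DLL (right-click > Properties > Unblock) or allowing remote loads in sgen's own configuration; a sketch of the latter, assuming a default SDK path:

      <!-- sgen.exe.config, placed next to sgen.exe in
           C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools -->
      <configuration>
        <runtime>
          <loadFromRemoteSources enabled="true"/>
        </runtime>
      </configuration>

    Setting "Generate serialization assembly" to Off for the Release configuration sidesteps sgen entirely, which is another way to confirm whether this is the cause.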

    Read the article

  • Delphi: Using Enumerators to filter TList<T: class> by class type?

    - by afrazier
    Okay, this might be confusing. What I'm trying to do is use an enumerator to only return certain items in a generic list based on class type. Given the following hierarchy: type TShapeClass = class of TShape; TShape = class(TObject) private FId: Integer; public function ToString: string; override; property Id: Integer read FId write FId; end; TCircle = class(TShape) private FDiameter: Integer; public property Diameter: Integer read FDiameter write FDiameter; end; TSquare = class(TShape) private FSideLength: Integer; public property SideLength: Integer read FSideLength write FSideLength; end; TShapeList = class(TObjectList<TShape>) end; How can I extend TShapeList such that I can do something similar to the following: procedure Foo; var ShapeList: TShapeList; Shape: TShape; Circle: TCircle; Square: TSquare; begin // Create ShapeList and fill with TCircles and TSquares for Circle in ShapeList<TCircle> do begin // do something with each TCircle in ShapeList end; for Square in ShapeList<TSquare> do begin // do something with each TSquare in ShapeList end; for Shape in ShapeList<TShape> do begin // do something with every object in TShapeList end; end; I've tried extending TShapeList using an adapted version of Primoz Gabrijelcic's bit on Parameterized Enumerators using a factory record as follows: type TShapeList = class(TObjectList<TShape>) public type TShapeFilterEnumerator<T: TShape> = record private FShapeList: TShapeList; FClass: TShapeClass; FIndex: Integer; function GetCurrent: T; public constructor Create(ShapeList: TShapeList); function MoveNext: Boolean; property Current: T read GetCurrent; end; TShapeFilterFactory<T: TShape> = record private FShapeList: TShapeList; public constructor Create(ShapeList: TShapeList); function GetEnumerator: TShapeFilterEnumerator<T>; end; function FilteredEnumerator<T: TShape>: TShapeFilterFactory<T>; end; Then I modified Foo to be: procedure Foo; var ShapeList: TShapeList; Shape: TShape; Circle: TCircle; Square: TSquare; begin // Create ShapeList and fill with TCircles and TSquares for Circle in ShapeList.FilteredEnumerator<TCircle> do begin // do something with each TCircle in ShapeList end; for Square in ShapeList.FilteredEnumerator<TSquare> do begin // do something with each TSquare in ShapeList end; for Shape in ShapeList.FilteredEnumerator<TShape> do begin // do something with every object in TShapeList end; end; However, Delphi 2010 is throwing an error when I try to compile Foo about Incompatible types: TCircle and TShape. If I comment out the TCircle loop, then I get a similar error about TSquare. If I comment the TSquare loop out as well, the code compiles and works. Well, it works in the sense that it enumerates every object since they all descend from TShape. The strange thing is that the line number that the compiler indicates is 2 lines beyond the end of my file. In my demo project, it indicated line 177, but there's only 175 lines. Is there any way to make this work? I'd like to be able to assign to Circle directly without going through any typecasts or checking in my for loop itself.

    Read the article

  • Delphi - Is there a better way to get state abbreviations from state names

    - by Bill
      const
        states: array [0..49, 0..1] of string = (
          ('Alabama','AL'),       ('Montana','MT'),
          ('Alaska','AK'),        ('Nebraska','NE'),
          ('Arizona','AZ'),       ('Nevada','NV'),
          ('Arkansas','AR'),      ('New Hampshire','NH'),
          ('California','CA'),    ('New Jersey','NJ'),
          ('Colorado','CO'),      ('New Mexico','NM'),
          ('Connecticut','CT'),   ('New York','NY'),
          ('Delaware','DE'),      ('North Carolina','NC'),
          ('Florida','FL'),       ('North Dakota','ND'),
          ('Georgia','GA'),       ('Ohio','OH'),
          ('Hawaii','HI'),        ('Oklahoma','OK'),
          ('Idaho','ID'),         ('Oregon','OR'),
          ('Illinois','IL'),      ('Pennsylvania','PA'),
          ('Indiana','IN'),       ('Rhode Island','RI'),
          ('Iowa','IA'),          ('South Carolina','SC'),
          ('Kansas','KS'),        ('South Dakota','SD'),
          ('Kentucky','KY'),      ('Tennessee','TN'),
          ('Louisiana','LA'),     ('Texas','TX'),
          ('Maine','ME'),         ('Utah','UT'),
          ('Maryland','MD'),      ('Vermont','VT'),
          ('Massachusetts','MA'), ('Virginia','VA'),
          ('Michigan','MI'),      ('Washington','WA'),
          ('Minnesota','MN'),     ('West Virginia','WV'),
          ('Mississippi','MS'),   ('Wisconsin','WI'),
          ('Missouri','MO'),      ('Wyoming','WY')
        );

      function getabb(state: string): string;
      var
        I: Integer;
      begin
        for I := 0 to Length(states) - 1 do
          if LowerCase(state) = LowerCase(states[I, 0]) then
          begin
            Result := states[I, 1];
          end;
      end;

      function getstate(state: string): string;
      var
        I: Integer;
      begin
        for I := 0 to Length(states) - 1 do
          if LowerCase(state) = LowerCase(states[I, 1]) then
          begin
            Result := states[I, 0];
          end;
      end;

      procedure TForm2.Button1Click(Sender: TObject);
      begin
        edit1.Text := getabb(edit1.Text);
      end;

      procedure TForm2.Button2Click(Sender: TObject);
      begin
        edit1.Text := getstate(edit1.Text);
      end;

      end.

    Is there a better way to do this?
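    A sketch of one alternative, assuming Delphi 2009 or later so that Generics.Collections is available: load the pairs into two dictionaries once, after which each lookup is a hash probe instead of a linear scan, and a miss can be reported explicitly instead of leaving Result undefined.

      uses
        SysUtils, Generics.Collections;

      var
        NameToAbbr, AbbrToName: TDictionary<string, string>;

      procedure InitStateLookups;
      var
        I: Integer;
      begin
        NameToAbbr := TDictionary<string, string>.Create;
        AbbrToName := TDictionary<string, string>.Create;
        for I := 0 to Length(states) - 1 do
        begin
          // store lower-cased keys so lookups are case-insensitive
          NameToAbbr.Add(LowerCase(states[I, 0]), states[I, 1]);
          AbbrToName.Add(LowerCase(states[I, 1]), states[I, 0]);
        end;
      end;

      function GetAbb(const State: string): string;
      begin
        if not NameToAbbr.TryGetValue(LowerCase(State), Result) then
          Result := '';   // unknown state name
      end;

      function GetState(const Abbr: string): string;
      begin
        if not AbbrToName.TryGetValue(LowerCase(Abbr), Result) then
          Result := '';   // unknown abbreviation
      end;

    InitStateLookups would be called once (for example from the form's OnCreate), and the two dictionaries freed on shutdown.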

    Read the article

  • Asp.net mvc application deployment / security issues

    - by WestDiscGolf
    I'll start with appologies; I wasn't sure if this was best posted here of Server Fault so if its in the wrong place then please move :-) Basic information I have written the first module of a new application at work. This is written using Visual Studio 2010, targetting .net 3.5 (at the moment) and asp.net mvc 2. This has been working fine during development running on the built in Development server from VS but however does not work once deployed to IIS 7/7.5. To deploy the application, I have built it in release mode and created a deployment package by right clicking on the project in the solution explorer (this will be done with an automated build in tfs once upgrade from the beta). This has then been imported into IIS on the server. The application is using windows/domain authentication. Issue #1 I can fire up internet explorer and browse to the application from a client computer as well as on a remote desktop connection. I can execute the code which reads/stores data in Session fine from the IE instance on the remote desktop but if I browse to it from the client pc it seems to lose the session state. I click on the form submit and the page refreshes and doesn't execute the required code. I've tried setting with; InProc, SQLServer and StateServer. but with no luck :-( Issue #2 As part of the application it views PDF and Tiff documents on the fly which are on a network share on the office network and creates thumbnails if the document hasn't been viewed before. This works if running on the machine the application is deployed to; however when browsing from a client pc I get an error saying: Access to the path '\\fileserver\folder\file.tif' is denied Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.UnauthorizedAccessException: Access to the path '\\fileserver\folder\file.TIF' is denied. ASP.NET is not authorized to access the requested resource. Consider granting access rights to the resource to the ASP.NET request identity. ASP.NET has a base process identity (typically {MACHINE}\ASPNET on IIS 5 or Network Service on IIS 6) that is used if the application is not impersonating. If the application is impersonating via , the identity will be the anonymous user (typically IUSR_MACHINENAME) or the authenticated request user. As this is on a different server the user is not accessible. To get round this I have tried: 1 - setting the application pool to run as domain administrator (I know this is a security risk, but I'm just trying to get it to work at the moment!) 2 - to set the log on account for World Wide Web Publishing service to be the domain admin . When trying to restart the service I get ... Windows could not start the World Wide Web Publishing Service service on the Local Computer. Error 1079: The account specified for this service is different from the account specified fro the other services running in the same process. Any pointers/help would be much appriciated as I'm pulling my hair out (of what little I have left). Update I've been using this funky little tool I found - DelegConfig v2 beta (Delegation / Kerberos Configuration Tool). This has been really usefull. So I've got the accessing of the file share working (there is a test page which will read the files) so now I've just got the issue of passing through the users credentials through to the SQL Server (wans't my choice to do it this way!!) 
to execute the queries etc. but I can't get it to log on as the user. It tries to access it as "NT Authority\Network Service" which doesn't have a sql login (as should be the logged on user). My connection string is: <add name="User" connectionString="Data Source=.;Integrated Security=True" providerName="System.Data.SqlClient" /> No initial catalog is specified as the system is over multiple dbs (also wasn't my choice!!). I really appriciate all the help so far! :-) Any further hints?!
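    Since both symptoms - session state evaporating for remote clients and the SQL connection arriving as "NT Authority\Network Service" - come down to which identity the request runs under and where session data lives, it may help to pin both down explicitly in web.config. A sketch of the pieces usually involved; the state server host name is an assumption, and the Kerberos/SPN setup needed for the credentials to survive the second hop to SQL Server is not shown:

      <system.web>
        <authentication mode="Windows" />
        <!-- run the request under the authenticated Windows user instead of the app pool identity -->
        <identity impersonate="true" />
        <!-- keep session out of the worker process so recycles / multiple workers don't lose it -->
        <sessionState mode="StateServer"
                      stateConnectionString="tcpip=stateserverhost:42424"
                      timeout="20" />
      </system.web>

    With Integrated Security in the connection string, whatever identity the page executes under is exactly what SQL Server sees, so running a quick SELECT SUSER_SNAME() from the application is a useful sanity check.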

    Read the article

  • VS2010: "Select Resource" dialog & resx location

    - by Dav
    Got two issues with the VS2010 / VS2008 select resource dialog - the one that appears when you want to add an image to a button in a WinForms app for example. Give me my files back! It only seems to see the default project resources file (Properties\Resources.resx), and resx files in project root (say MyProject\famfamfam.resx). We have quite a few icons all over the app, and because some of them come from different icon sets (like famfamfam), and some are related to this project only we'd like to keep them separate. For that same reason (keeping solution neat & tidy) we want to store these extra resource files in the Resources folder (eg. Resources\famfamfam.resx). However, we'd also like to keep using the Select Resource dialog :-) Because it does not see the 'extra' resource files, we're having to select a 'fake' icon now (from the global Resources.resx file) and then manually change that to reference the right icon in .Designer.cs. As you can imagine, this is a pain. Stop modifying my files! Second issue is a bit more annoying. We use the excellent MultiLang add-in for Visual Studio to globalize our app. It stores its translations in MultiLang.resx & MultiLang.XY.resx files in the project root, where XY is a language code, eg. .cs.resx for Czech. These have to be set to No code generation access modifier. What Select Resource seems to be doing is set all .resx files it can find to Internal. Exec summary Is there a way to convince Select Resource dialog to look for extra .resx files anywhere besides the project root? Is there any way to stop it from modifying the access modifier of the resources it does see (other than file a bug with MS)? Thanks in advance for any suggestions!

    Read the article

  • SQL Server schema comparison - leaves behind obsolete files on synchronization

    - by b0x0rz
    Directly related to http://stackoverflow.com/questions/2768489/visual-studio-2010-database-project-is-there-a-visual-way/2772205#2772205 I have a problem synchronizing between a database project and a database in Visual Studio. Usually I synchronize FROM the database TO the database project (using Data > Schema Compare > New Schema Comparison in Visual Studio). The synchronization works, BUT when, for example, I corrected the spelling of a key in the database and synchronized, the file with the WRONG spelling of the key remained (albeit with its contents commented out), while the new one was correctly added. This file happens to be in: [project name]/Scheme Objects/Schemas/dbo/Tables/Keys - but there are surely others elsewhere. How can such obsolete files be removed automatically when synchronizing? Thanks.

    Read the article
