Search Results

Search found 22255 results on 891 pages for 'www ruu cc'.

  • applying rules to CC'd messages in Outlook 2007

    - by Danny Chia
    This is probably a silly question, but here goes: I have two e-mail aliases that forward messages to my main address. I'm trying to create a rule to move all messages that I receive to a specific folder. There is a condition that applies to messages "where my name is in the To or Cc box," but it doesn't let me specify what "my name" is. Not surprisingly, it only affects messages that have not been sent to an alias. So far, I found a solution as follows: I select the condition that applies to messages with specific words in the recipient's address, and I enter my address and aliases as those "words." It's kind of an awkward hack, but it works. Normally, this wouldn't be much of an issue, but I have a "family computer" that is shared among my parents and myself, and I don't want their e-mails and mine to be jumbled together in the Inbox. So my questions are: Is there a solution that is less awkward than the one I used? Alternatively, is there a way to assign multiple e-mail addresses (or aliases) to one account? Thanks!

  • How to auto-cc a system email account any time a user creates an appointment

    - by Ferdy
    I will not bother explaining my full architecture or reasons for wanting this, in order to keep this question short: is it possible to auto-cc a certain email account any time an Exchange user creates an appointment or meeting in his own calendar?

    - Is it possible using rules?
    - Our Exchange 2007 server is outsourced; I cannot change the configuration or install plugins server-side.
    - Preferably, it should still work server-side, because users may use the Outlook client but also Outlook Web Access.
    - Is there any other way, perhaps using group policies?

    My conclusion so far is that the only viable way to accomplish this is to build an Outlook add-on. The problem there is that it will need to be managed for thousands of desktop users, and the add-on will not work when using another client (OWA, mobile). An alternative architecture could be to pull the information from the users' calendars on a scheduled basis. Given that we are talking about a lot of users, scalability is a major issue; this has also been confirmed by Microsoft. Can you confirm that my thinking is correct, or do you have any other solutions?

  • edited and reversed changes on .htaccess - site starts redirecting to .comindex.php/

    - by Aurigae
    The site is a Joomla 2.5 site. I wanted to add a non-www to www redirect to the .htaccess file; I did so, and the redirection went mad. I reversed the change, but the site still redirects. When I click "view site" in the admin panel, I get linked to http://domain.comindex.php/ (the website is http://www.domain.com). Visiting the website URL works without www, but once you click on "projects" it acts mad too. Projects is managed with the JoomShopping extension. EDIT: the redirect also happens when rewrite is deactivated in the admin panel.

        ##
        # @package    Joomla
        # @copyright  Copyright (C) 2005 - 2012 Open Source Matters. All rights reserved.
        # @license    GNU General Public License version 2 or later; see LICENSE.txt
        ##

        ##
        # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
        #
        # The line just below this section: 'Options +FollowSymLinks' may cause problems
        # with some server configurations. It is required for use of mod_rewrite, but may already
        # be set by your server administrator in a way that disallows changing it in
        # your .htaccess file. If using it causes your server to error out, comment it out (add # to
        # beginning of line), reload your site in your browser and test your sef url's. If they work,
        # it has been set by your server administrator and you do not need it set here.
        ##

        ## Can be commented out if causes errors, see notes above.
        Options +FollowSymLinks

        ## Mod_rewrite in use.
        RewriteEngine On

        ## Begin - Rewrite rules to block out some common exploits.
        # If you experience problems on your site block out the operations listed below
        # This attempts to block the most common type of exploit `attempts` to Joomla!
        #
        # Block out any script trying to base64_encode data within the URL.
        RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
        # Block out any script that includes a <script> tag in URL.
        RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
        # Block out any script trying to set a PHP GLOBALS variable via URL.
        RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
        # Block out any script trying to modify a _REQUEST variable via URL.
        RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
        # Return 403 Forbidden header and show the content of the root homepage
        RewriteRule .* index.php [F]
        #
        ## End - Rewrite rules to block out some common exploits.

        ## Begin - Custom redirects
        #
        # If you need to redirect some pages, or set a canonical non-www to
        # www redirect (or vice versa), place that code here. Ensure those
        # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
        #
        ## End - Custom redirects

        ##
        # Uncomment following line if your webserver's URL
        # is not directly related to physical file paths.
        # Update Your Joomla! Directory (just / for root).
        ##
        # RewriteBase /

        ## Begin - Joomla! core SEF Section.
        #
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
        #
        # If the requested path and file is not /index.php and the request
        # has not already been internally rewritten to the index.php script
        RewriteCond %{REQUEST_URI} !^/index\.php
        # and the request is for something within the component folder,
        # or for the site root, or for an extensionless URL, or the
        # requested URL ends with one of the listed extensions
        RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
        # and the requested path and file doesn't directly match a physical file
        RewriteCond %{REQUEST_FILENAME} !-f
        # and the requested path and file doesn't directly match a physical folder
        RewriteCond %{REQUEST_FILENAME} !-d
        # internally rewrite the request to the index.php script
        RewriteRule .* index.php [L]
        #
        ## End - Joomla! core SEF Section.

        Redirect 301 /index.html /index.php
        Redirect 301 /services /project
        Redirect 301 /projects/projects.html /project
        Redirect 301 /projects/project1.html /project
        Redirect 301 /projects/project2.html /project
        Redirect 301 /projects /project
        Redirect 301 /keypersonnel.html /about-agrin/keystaff
        Redirect 301 /cooperation.htm /about-agrin/intcoop
        Redirect 301 /member.html /about-agrin/memberships
        Redirect 301 /contact.html /contacts
        Redirect 301 /hr.htm /jobs
        Redirect 301 /index.php/404 /index.php
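
    For reference, the canonical non-www to www redirect that the "Custom redirects" section describes is usually written as below. This is a sketch with the post's placeholder domain, not the poster's actual rule; note also that a previously saved broken 301 may be cached by the browser, so any test should use a fresh client.

        ## Begin - Custom redirects
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
        ## End - Custom redirects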

  • How can I make WWW::Mechanize not fetch pages twice?

    - by planetp
    I have a web-scraping application written in OO Perl. There is a single WWW::Mechanize object used in the app. How can I make it not fetch the same URL twice, i.e. make a second get() with the same URL a no-op?

        my $mech = WWW::Mechanize->new();
        my $url = 'http://google.com';
        $mech->get( $url );   # first time, fetch
        $mech->get( $url );   # same url, do nothing
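
    One way to get that behaviour (a minimal sketch, not from the original thread; the class name CachingMech is made up) is a thin subclass that remembers which URLs it has fetched and turns repeat get() calls into no-ops:

        package CachingMech;
        use strict;
        use warnings;
        use parent 'WWW::Mechanize';

        sub get {
            my ($self, $url, @args) = @_;
            # Already fetched: skip the network round-trip entirely and
            # leave the agent's current page/response untouched.
            return $self->res if $self->{_seen}{$url}++;
            return $self->SUPER::get($url, @args);
        }

        package main;
        my $mech = CachingMech->new();
        $mech->get('http://google.com');   # fetched
        $mech->get('http://google.com');   # no-op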

  • some issues with removing www and redirecting index.html

    - by MariaKeys
    Hello fellas, I am having trouble doing what I want to do with the following setup. I would like to remove all www, and also forward index.html to the root dir. I would like this to work for all domains, so I am doing it inside an httpd.conf Directory directive. I have tried many variations with no success; the latest version is below (domains are inside /var/www/html, in separate directories).

        http://www.example.com/index.html           -> http://example.com
        http://www.example.com/someother/index.html -> http://example.com/someother/

    Thanks, Maria

        <Directory "/var/www/html/*/">
            RewriteEngine on
            RewriteBase /
            RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
            RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
            #RewriteCond %{REQUEST_URI} /^index\.html/
            RewriteRule ^(.*)index\.html$ / [R=301,L]
            Options ExecCGI Includes FollowSymLinks
            AllowOverride AuthConfig
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
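
    A hedged sketch of how this pair of rules is often written (assuming mod_rewrite is active in this per-directory context): the second rule below keeps the directory part of the path, which the index.html rule above throws away by redirecting everything to /.

        RewriteEngine on

        # Strip a leading www. from any host, preserving the requested path.
        RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

        # Send .../index.html to its own directory rather than the site root.
        RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]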

  • Apache2 configuration, .htaccess and 310 error (www redirection)

    - by allstat
    I have an Ubuntu Apache server with many websites. All my websites have the same bug, so it looks like a misconfiguration:

        http://www.2sigma.fr   <- works fine (we see "en travaux")
        http://2sigma.fr       <- doesn't work, I get a 310 error (cyclic redirection!)

    Here is my .htaccess:

        Options +FollowSymlinks
        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^2sigma\.fr$
        RewriteRule ^(.*) http://www.2sigma.fr/$1 [R=301,L]

    Here is my configuration:

        <VirtualHost *:80>
            <IfModule mpm_itk_module>
                AssignUserId sigma www-data
            </IfModule>
            ServerAdmin [email protected]
            ServerName 2sigma.fr
            ServerAlias www.2sigma.fr
            DocumentRoot /home/sigma/www
            <Directory />
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Directory /home/sigma/www>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog /var/log/apache2/error_sigma
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
            CustomLog /var/log/apache2/access_sigma combined
            ServerSignature Off

    If I use this .htaccess instead, it works fine:

        Options +FollowSymlinks
        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^2sigma\.fr$
        RewriteRule ^(.*) http://www.google.fr/$1 [R=301,L]

    I think it is an Apache configuration problem, but I don't know how to solve it. Thanks for your help.
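
    One low-risk way to narrow a redirect loop like this down (a diagnostic sketch, nothing site-specific assumed) is to follow the Location headers from the command line, once per host form, and see which response keeps bouncing back:

        # Show only the status line and the Location header of each response.
        curl -sI http://2sigma.fr/     | grep -iE '^(HTTP|Location)'
        curl -sI http://www.2sigma.fr/ | grep -iE '^(HTTP|Location)'

    If the www host itself answers with a redirect back to the bare domain, the loop lives in the server or DNS setup for that name rather than in the .htaccess shown above.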

  • How can I handle UTF-8 while posting to a vBulletin board with WWW::Mechanize?

    - by MrMirror
    I have a problem with some automated posting to a bulletin board. If I send the posting form to the vBulletin board, I get corrupted entities. Feel free to copy-paste the script and try it. It looks like the board expects some decoded UTF-8, but if I send the message decoded, the entities are still wrong.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use WWW::Mechanize;
        use Digest::MD5 qw(md5_hex);

        my $mech = WWW::Mechanize->new();
        my $base_url = 'http://www.boerse.bz/';
        my $username = 'MrMirror';
        my $password = 'test';

        $mech->get($base_url);

        print "Login\n";
        $mech->form_number(1);
        $mech->field('vb_login_username' => $username);
        $mech->field('vb_login_password' => $password);
        $mech->field('vb_login_md5password' => md5_hex($password));
        $mech->field('vb_login_md5password_utf' => md5_hex($password));
        $mech->submit();

        unless ($mech->content() =~ m!Weiterleitung!gi) {
            print "No Rediction!\n";
            exit;
        }
        print "Redict\n";

        $mech->get($base_url);
        unless ($mech->content() =~ m!Logout!gi) {
            print "Login Failed!\n";
            exit;
        }

        $mech->get($base_url . '/newthread.php?do=newthread&f=173');
        $mech->form_number(3);
        $mech->field('subject' => 'MrMirror makes some testing ä ö ü ß');
        $mech->field('message' => "ä ö ü ß");

        ### everything allright here
        $mech->dump_forms();

        ### preview submit, don't wanna spam around ;)
        $mech->click('preview');

        print "\n\n\n---------------------------------------------------------------------\n\n\n";

        ### same form, wrong entities :(
        $mech->dump_forms();
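
    A hedged guess at the usual culprit (not confirmed by the post): WWW::Mechanize wants decoded Perl character strings, not raw bytes, and encodes the form itself according to the page's charset. If the script file is saved as UTF-8, declaring that is often enough; this fragment continues the script above:

        use utf8;                    # literals in this file are UTF-8 encoded
        use Encode qw(decode);

        # Pass decoded character strings to the form fields.
        $mech->field('subject' => 'MrMirror makes some testing ä ö ü ß');
        $mech->field('message' => "ä ö ü ß");

        # Text arriving as bytes from elsewhere should be decoded first, e.g.:
        # my $message = decode('UTF-8', $raw_bytes);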

  • Why can't Perl's WWW::Mechanize find the form by field names?

    - by FRESHTER
        #!/usr/bin/perl
        use WWW::Mechanize;
        use Compress::Zlib;

        my $mech = WWW::Mechanize->new();
        my $username = "";   # fill in username here
        my $keyword  = "";   # fill in password here
        my $mobile   = $ARGV[0];
        my $text     = $ARGV[1];
        $deb = 1;

        print length($text)."\n" if ($deb);
        $text = $text."\n\n\n\n\n" if (length($text) < 135);

        $mech->get("http://wwwl.way2sms.com/content/index.html");
        unless ($mech->success()) { exit; }
        $dest = $mech->response->content;
        print "Fetching...\n" if ($deb);
        if ($mech->response->header("Content-Encoding") eq "gzip") {
            $dest = Compress::Zlib::memGunzip($dest);
            $mech->update_html($dest);
        }
        $dest =~ s/<form name="loginForm"/<form action='..\/auth.cl' name="loginForm"/g;
        $mech->update_html($dest);

        $mech->form_with_fields(("username", "password"));
        $mech->field("username", $username);
        $mech->field("password", $keyword);
        print "Loggin...\n" if ($deb);
        $mech->submit_form();

        $dest = $mech->response->content;
        if ($mech->response->header("Content-Encoding") eq "gzip") {
            $dest = Compress::Zlib::memGunzip($dest);
            $mech->update_html($dest);
        }

        $mech->get("http://wwwl.way2sms.com//jsp/InstantSMS.jsp?val=0");
        $dest = $mech->response->content;
        if ($mech->response->header("Content-Encoding") eq "gzip") {
            $dest = Compress::Zlib::memGunzip($dest);
            $mech->update_html($dest);
        }

        print "Sending ... \n" if ($deb);
        $mech->form_with_fields(("MobNo", "textArea"));
        $mech->field("MobNo", $mobile);
        $mech->field("textArea", $text);
        $mech->submit_form();

        if ($mech->success()) {
            print "Done \n" if ($deb);
        } else {
            print "Failed \n" if ($deb);
            exit;
        }

        $dest = $mech->response->content;
        if ($mech->response->header("Content-Encoding") eq "gzip") {
            $dest = Compress::Zlib::memGunzip($dest);
            #print $dest if ($deb);
        }
        if ($dest =~ m/successfully/sig) {
            print "Message sent successfully" if ($deb);
        }
        exit;

    When run, this script halts with an error saying:

        There is no form with the requested fields at ./sms.pl line 65
        Can't call method "value" on an undefined value at /usr/share/perl5/vendor_perl/WWW/Mechanize.pm line 1348.
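
    A small diagnostic sketch (an editorial addition continuing the script above, with the field names taken from it): dump what Mechanize actually parsed out of the page before asking for a form by field names. If MobNo/textArea never show up here, the form is probably assembled by JavaScript and invisible to Mechanize.

        # List every form and input name Mechanize sees on the current page.
        for my $form ($mech->forms) {
            print "form: ", ($form->attr('name') // '(unnamed)'), "\n";
            for my $input ($form->inputs) {
                print "  input: ", ($input->name // '(unnamed)'), "\n";
            }
        }

        # Guard the lookup instead of letting the undef propagate.
        my $form = eval { $mech->form_with_fields("MobNo", "textArea") };
        die "target form not found on this page\n" unless $form;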

  • 301 redirect root domain to www subdomain on godaddy windows hosting account

    - by Greg
    If I type in domain.com and www.domain.com, they both show the same website but show different URLs in the address bar. I'd like visitors and search engines that just type "domain.com" to be redirected to "www.domain.com". I'm using IIS 7 on a GoDaddy hosting account. How do I redirect all requests for "domain.com" to "www.domain.com"? I have the default DNS setup: "domain.com" as my A record, and the CNAME "www" points to my A record.
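
    Assuming the IIS URL Rewrite module is available on the account (not a given on shared hosting, so treat this as a sketch to adapt), the canonical-host redirect usually lives in web.config:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- Permanently redirect bare domain.com requests to www.domain.com. -->
                <rule name="Canonical www host" stopProcessing="true">
                  <match url="(.*)" />
                  <conditions>
                    <add input="{HTTP_HOST}" pattern="^domain\.com$" />
                  </conditions>
                  <action type="Redirect" url="http://www.domain.com/{R:1}" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>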

  • Perl - WWW::Mechanize Cookie Session Id is being reset with every get(), how to make it stop?

    - by Phill Pafford
    So I'm scraping a site that I have access to via HTTPS. I can log in and start the process, but each time I hit a new page (URL) the cookie session ID changes. How do I keep the logged-in cookie session ID?

        #!/usr/bin/perl -w
        use strict;
        use warnings;
        use WWW::Mechanize;
        use HTTP::Cookies;
        use LWP::Debug qw(+);
        use HTTP::Request;
        use LWP::UserAgent;
        use HTTP::Request::Common;

        my $un = 'username';
        my $pw = 'password';
        my $url = 'https://subdomain.url.com/index.do';

        my $agent = WWW::Mechanize->new(cookie_jar => {}, autocheck => 0);
        $agent->{onerror} = \&WWW::Mechanize::_warn;
        $agent->agent('Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.3) Gecko/20100407 Ubuntu/9.10 (karmic) Firefox/3.6.3');
        $agent->get($url);
        $agent->form_name('form');
        $agent->field(username => $un);
        $agent->field(password => $pw);
        $agent->click("Log In");

        print "After Login Cookie: ";
        print $agent->cookie_jar->as_string();
        print "\n\n";

        my $searchURL = 'https://subdomain.url.com/search.do';
        $agent->get($searchURL);

        print "After Search Cookie: ";
        print $agent->cookie_jar->as_string();
        print "\n";

    The output:

        After Login Cookie: Set-Cookie3: JSESSIONID=367C6D; path="/thepath"; domain=subdomina.url.com; path_spec; secure; discard; version=0
        After Search Cookie: Set-Cookie3: JSESSIONID=855402; path="/thepath"; domain=subdomain.com.com; path_spec; secure; discard; version=0

    Also, I think the site requires a cert (well, in the browser it does); would this be the correct way to add it?

        $ENV{HTTPS_CERT_FILE} = 'SUBDOMAIN.URL.COM';  ## Insert this after the use HTTP::Request...

    Also, for the cert, is using the first option in this list correct?

        X.509 Certificate (PEM)
        X.509 Certificate with chain (PEM)
        X.509 Certificate (DER)
        X.509 Certificate (PKCS#7)
        X.509 Certificate with chain (PKCS#7)
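
    Two hedged observations, neither confirmable from the post alone. First, a JSESSIONID that changes on every hit usually means the server is rejecting or never receiving the cookie (the jar output above even shows two different cookie domains), so the outgoing Cookie headers are worth checking. Second, a file-backed jar makes that traffic easy to inspect; the file path below is hypothetical:

        use WWW::Mechanize;
        use HTTP::Cookies;

        # Persist cookies to disk so each run's Set-Cookie traffic can be inspected.
        my $jar = HTTP::Cookies->new(
            file     => "$ENV{HOME}/.scraper_cookies.txt",   # hypothetical location
            autosave => 1,
        );
        my $agent = WWW::Mechanize->new(cookie_jar => $jar, autocheck => 0);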

  • A/UX cc compiler errors on trivial code: "declared argument argc is missing"

    - by Fzn
    On a quite ancient UNIX (Apple A/UX 3.0.1 for 680x0 processors), using the built-in C compiler (cc), this issue arises. Here is the code I'm trying to compile:

        #include <stdlib.h>
        #include <stdio.h>

        int main()
        int argc;
        char **argv;
        {
            if (argc > 1)
                puts(argv[1]);
            return (EXIT_SUCCESS);
        }

    And here is the output I get:

        pigeonz.root # cc -c test.c
        "test.c", line 5: declared argument argc is missing
        "test.c", line 6: declared argument argv is missing

    Using a more modern prototype did not help, nor did the manual page, nor a quick Google search. What am I doing wrong?
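
    For reference, a sketch of the old-style (K&R, pre-ANSI) form that compilers of that era expect: the parameter names go inside the parentheses, and their type declarations follow before the opening brace. That matches the error text, which complains that the declared arguments have no corresponding names in the parameter list.

        #include <stdlib.h>
        #include <stdio.h>

        /* K&R-style definition: names in the parens, types declared after. */
        int main(argc, argv)
        int argc;
        char **argv;
        {
            if (argc > 1)
                puts(argv[1]);
            return (EXIT_SUCCESS);
        }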

  • Ubuntu User WWW/FTP

    - by SnIpY
    I have a user named 'user' which I use to log in to the FTP of my website. However, this presents me with a problem. If I want to allow my user access to FTP, I have to type the following:

        chown -R user:ftpusers /var/www/

    By doing this, my website is no longer available when surfing to it. To make it available again, I have to type the following command:

        chown -R www-data:www-data /var/www/

    The user 'user' is in both the ftpusers and www-data groups. How can I fix this so I don't have to keep typing these commands? I'm using apache2 and vsftpd on Ubuntu.
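
    A common middle ground (a sketch, assuming Apache only needs to read the files, not write them): keep the FTP user as owner, give the files to the www-data group, and set the setgid bit so new uploads inherit that group.

        # Owner 'user' uploads via FTP; group www-data lets Apache read.
        chown -R user:www-data /var/www/
        chmod -R u=rwX,g=rX,o= /var/www/

        # setgid on directories: files created later inherit the www-data group.
        find /var/www/ -type d -exec chmod g+s {} +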

  • Basic clarification about Limited FTP/sFTP users

    - by mattewre
    I would like some clarification about the correct way to create limited users with access to my VPS, which runs Nginx as the web server. I'm used to not installing FTP and allowing access via SFTP only; is that an OK setup in general? This is what I usually do to create a limited user called "admin" that should be able to access the folder with the website data via SFTP:

        mkdir -p /var/www/mysite.com/
        adduser admin
        adduser admin www-data
        chown -R root:root /var/www
        chmod -R 755 /var/www
        chmod -R 755 /var/www/mysite.com
        chown -R admin:www-data /var/www/mysite.com/

    It seems not to be the correct way; I always have problems with permissions when I upload some files (for example, with Wordpress in general). I would like to create a user that works exactly like the one hosting providers give their clients when they buy a hosting service (that is FTP, though I would prefer SFTP access). It is for personal use, but I think a limited user is a lot safer to use than root via SFTP.
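
    For an SFTP-only account that behaves like a hosting provider's, the usual pattern is OpenSSH's internal-sftp with a chroot; a sketch (the group name sftponly is made up, and sshd requires the chroot directory itself to be root-owned and not group-writable):

        # /etc/ssh/sshd_config
        Subsystem sftp internal-sftp

        Match Group sftponly
            ChrootDirectory /var/www/%u     # must be owned by root, mode 755
            ForceCommand internal-sftp
            AllowTcpForwarding no
            X11Forwarding no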

  • Who should own /var/www? [closed]

    - by John
    Possible Duplicate: How should I structure my users/groups/permissions for a web server? I've seen a few answers to this on the internet, but I'm looking for a definitive answer. I have a new Ubuntu 12.04 LTS server with LAMP. Apache is set to run as "www-data" and /var/www is set as having "root" as the owner and "root" as the group. The permissions for /var/www are "drwxr-xr-x" which I believe translates to 755 numerically. I know that /var/www should not be owned by "www-data" because then buggy/malicious code could have a field day. However, should I keep it as root:root (inconvenient) or should I change it to ubuntu:ubuntu, the default user that Ubuntu preconfigures for you to log in with? Should the permissions remain at 755? I've been administrating systems for a while with no big security issues, but I'm trying to get really serious about security, double-check everything, and make sure that there are no gaps in my knowledge.

  • .htaccess 301 redirect with regex?

    - by Eddie ZA
    How do I do this with a regular expression?

        Old                                 New
        http://www.example.com/1.html    -> http://www.example.com/dir/1.html
        http://www.example.com/2.html    -> http://www.example.com/dir/2.html
        http://www.example.com/3.asp     -> http://www.example.com/dir/3.html
        http://www.example.com/4.asp     -> http://www.example.com/dir/4.html
        http://www.example.com/4_a.html  -> http://www.example.com/dir/sub/4-a.html
        http://www.example.com/4_b.html  -> http://www.example.com/dir/sub/4-b.html

    I've tried this:

        Redirect 301 /1.html http://www.example.com/dir/1.html
        Redirect 301 /2.html http://www.example.com/dir/2.html
        Redirect 301 /3.asp http://www.example.com/dir/3.html
        Redirect 301 /4.asp http://www.example.com/dir/4.html
        Redirect 301 /4_a.html http://www.example.com/dir/sub/4-a.html
        Redirect 301 /4_b.html http://www.example.com/dir/sub/4-b.html
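
    Assuming every old URL follows exactly the two patterns shown above (a sketch, not a drop-in replacement), mod_rewrite can collapse the six redirects into two rules:

        RewriteEngine On

        # 4_a.html style: digits, underscore, letter -> /dir/sub/ with a hyphen.
        RewriteRule ^([0-9]+)_([a-z])\.html$ /dir/sub/$1-$2.html [R=301,L]

        # 1.html / 3.asp style: plain digits -> /dir/, always ending in .html.
        RewriteRule ^([0-9]+)\.(html|asp)$ /dir/$1.html [R=301,L]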

  • wildcard deal with www as a subdomain

    - by Alaa Gamal
    I am using a wildcard with Apache. My Apache config:

        ServerAlias *.staronece1.com
        DocumentRoot /staronece1/domains

    My named file:

        $ttl 38400
        staronece1.com. IN SOA staronece1.com. email.yahoo.com. (
            1334838782
            10800
            3600
            604800
            38400 )
        staronece1.com.         IN NS  staronece1.com.
        staronece1.com.         IN A   95.19.203.21
        www.staronece1.com.     IN A   95.19.203.21
        server.staronece1.com.  IN A   95.19.203.21
        mail.staronece1.com.    IN A   95.19.203.21
        ns1.staronece1.com.     IN A   95.19.203.21
        ns2.staronece1.com.     IN A   95.19.203.21
        staronece1.com.         IN NS  ns1.staronece1.com.
        staronece1.com.         IN NS  ns2.staronece1.com.
        staronece1.com.         IN MX  10 mail.staronece1.com.
        *               14400   IN A   95.19.203.21
        *.staronece1.com        IN A   95.19.203.21

    My PHP test file, /staronece1/domains/index.php:

        <?php
        function getBname(){
            $bname = explode(".", $_SERVER['HTTP_HOST'], 2);
            return $bname[0];
        }
        echo 'SubDomain is :'.getBname();
        ?>

    If I go to something.staronece1.com I get this result:

        SubDomain is : something

    Now the problem: if I go to www.staronece1.com I should get an empty result, because www is not a subdomain, but I get this result:

        SubDomain is : www

    And if I go to www.something.staronece1.com I get a Firefox error message (site not found). How do I fix this problem? I think the solution is: add a record for www in the named file. Thanks
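
    On the PHP side, one hedged fix (a sketch against the test file above, not a full answer to the DNS part) is to treat www as an alias of the bare domain by stripping it before splitting the host:

        <?php
        // Treat "www" as the bare domain: drop a leading "www." before
        // extracting the subdomain label.
        function getBname() {
            $host  = strtolower($_SERVER['HTTP_HOST']);
            $host  = preg_replace('/^www\./', '', $host);  // www.x.staronece1.com -> x.staronece1.com
            $parts = explode('.', $host);
            // Fewer than three labels means there is no real subdomain.
            return count($parts) > 2 ? $parts[0] : '';
        }
        echo 'SubDomain is :' . getBname();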

  • Redirecting a single request to another page, ignoring www subdomain

    - by Petter Brodin
    I have a site running on IIS 7.5 that does an automatic redirect from 'http://mysite.com/whatever.aspx' to 'http://www.mysite.com/whatever.aspx'. On the site there is a lot of traffic to an old URL that I want to redirect to the front page, index.aspx: 'http://mysite.com/foo/bar/index.cgi%something=asdf&somethingelse=qwerty'. The problem is that no matter what I try, I can only get the redirect to work with the www subdomain. If I use the URL without www, I just end up at 'http://www.mysite.com/404.aspx'. Any ideas? Thanks in advance for all help!

    Edit3: it seems like the browser caching the redirect response was messing with me, so edit2 is wrong. See my response below.

    Edit2: disregard edit1; it doesn't seem like it's working after all.

    Edit: here's some further info: using this article I've managed to redirect from 'http://mysite.com/foo/bar/index.cgi' to 'http://www.mysite.com/index.aspx', but if I add the query string parameters, it still redirects to 'http://www.mysite.com/404.aspx'. Isn't there a way to catch all requests to the cgi file, including query string parameters?

  • non www .htaccess redirect - ignore other subdomains.

    - by qxxx
    Hi, I have a .htaccess redirect for "non www" like this:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} !^www\.
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

    It is working. But I also have some subdomains other than www. If I call, for example, http://shop.example.com, it redirects me to http://www.shop.example.com. I don't want to write the subdomains into the .htaccess file; it should work automatically and just ignore anything other than www and the bare domain, like this:

        if subdomain == ''                            -> redirect to www.HTTP_HOST
        elseif subdomain != '' && subdomain != 'www'  -> do nothing

    Thanks!
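
    A sketch of one way to express that (assuming plain two-label domains such as example.com, not multi-part suffixes like example.co.uk): only add the www prefix when the host consists of exactly two labels, so existing subdomains fall through untouched.

        RewriteEngine on

        # Redirect bare example.com to www.example.com; shop.example.com
        # has three labels and therefore never matches this condition.
        RewriteCond %{HTTP_HOST} ^([^.]+\.[^.]+)$
        RewriteRule ^(.*)$ http://www.%1/$1 [R=301,L]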

  • Blacklist for To/CC in Thunderbird

    - by Robert Munteanu
    I have Thunderbird set up to auto-bcc my account on all emails, and I have disabled the 'Sent' folder. Unfortunately, if I'm in an email thread and Reply to All, I get double-copied: once by Reply to All, once by auto-bcc. Are there any workarounds for this?

  • Speaking at UK DevConnections (www.DevConnections.com/uk) and discount codes

    - by Testas
    Hi, I will be speaking at UK DevConnections on Analysis Services at the ExCeL conference centre in London on 13th-15th June 2011. Join top SQL Server names such as Paul Randall, Kimberley Tripp, Simon Sabin and Allan Mitchell (to name a few) at IT & DevConnections, powered by Microsoft UK, in London on 13th-15th June 2011. With UK DevConnections you can combine SQL sessions with other Microsoft technology stacks. Microsoft and leading independent industry presenters will deliver in-depth presentations and cutting-edge sessions on:

    - SharePoint
    - Windows
    - Exchange and Unified Communications
    - SQL Server
    - Silverlight
    - ASP.Net
    - Virtualisation
    - Cloud and Azure

    As a speaker I have a discount code that entitles you to 20% off the cost of registering for IT & DevConnections in June. The code is sql-bits. If people register before 31st March, when the Super Early Bird offer ends, you will pay only £639.20 + VAT (normal price £999). Furthermore, there are preferential hotel rates for this event at: https://logicalvenues.vbookings.co.uk/b/pentonlondon0611/ So if you want to attend a conference with a wide spectrum of technologies, then DevConnections may be up your street. Thanks, Chris

  • Silverlight TV 23: MVP Q&A with WWW (Wildermuth, Wahlin and Ward)

    John interviews a panel of three well-known Silverlight leaders, Shawn Wildermuth, Dan Wahlin, and Ward Bell, at the Silverlight 4 launch event. The guest panel answers questions sent in from Twitter about the features in Silverlight 4, thoughts on MVVM, and the panel members' experiences developing Silverlight. This is a great chance to hear from some of the leading Silverlight minds; these guys are all experts at building business applications with Silverlight. Relevant links: John's Blog...

  • WWW - Why Websites Work

    In this ever-changing world of digital domination, the internet is the most effective way to communicate your ideas, products, services, and information to others. Learn why having a website is so important for effective online marketing and education.
