Search Results

Search found 22083 results on 884 pages for 'display templates'.


  • Ubuntu can't find the correct max resolution with Samsung SyncMaster SA300

    - by fatmatto
    I decided to install Ubuntu on my desktop PC as well (Windows has been exorcised from my life), but I am having some problems I didn't have with previous hardware configurations. My display is a Samsung SyncMaster SA300; on Windows Vista the maximum resolution (1920x1080) worked well, but now Ubuntu (after installing the fglrx drivers) tells me that the maximum resolution is 1600x1200. I googled a lot last night and found a lot of people solving this (on different displays, though) with xrandr. I was not able to, because xrandr keeps complaining "your goddamn maximum resolution is 1600x1600". A plain xrandr command says:

    mattia@fatdesktop:~$ xrandr
    Screen 0: minimum 320 x 200, current 1600 x 1200, maximum 1600 x 1600
    DFP1 disconnected (normal left inverted right x axis y axis)
    CRT1 disconnected (normal left inverted right x axis y axis)
    CRT2 connected 1600x1200+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
       1600x1200 60.0*+
       1400x1050 60.0
       1280x1024 60.0 47.0 43.0
       1440x900 59.9
       1280x960 60.0
       1280x800 60.0
       1152x864 60.0 47.0 43.0
       1280x768 59.9 56.0
       1280x720 60.0 50.0
       1024x768 60.0 43.5
       800x600 60.3 56.2 47.0
       720x576 50.0
       720x480 60.0
       640x480 60.0
    TV disconnected (normal left inverted right x axis y axis)
    CV disconnected (normal left inverted right x axis y axis)

    Then, according to other internet posts and forums:

    mattia@fatdesktop:~$ cvt 1920 1080 60
    # 1920x1080 59.96 Hz (CVT 2.07M9) hsync: 67.16 kHz; pclk: 173.00 MHz
    Modeline "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync

    So now I have to add that modeline:

    mattia@fatdesktop:~$ xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    mattia@fatdesktop:~$ xrandr --addmode CRT2 1920x1080_60.00

    And here comes the pain:

    mattia@fatdesktop:~$ xrandr --output CRT2 --mode 1920x1080_60.00
    xrandr: screen cannot be larger than 1600x1600 (desired size 1920x1080)

    See? "screen cannot be larger than 1600x1600 (desired size 1920x1080)". At this point the 1920x1080 option appears inside the graphical resolution menu, but last night, when I tried to select it, my screen went black and I had to power off the PC. Any clues? Am I on the wrong path?
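    One approach often suggested for this symptom (a sketch only, not taken from the original question) is to raise the X server's maximum framebuffer size so the new mode fits; with the fglrx driver that usually means adding a Virtual line to the Display subsection of xorg.conf. The file path and section names below are the usual defaults and may differ on your system:

        # /etc/X11/xorg.conf -- hypothetical excerpt; back the file up first
        Section "Screen"
            Identifier "Default Screen"
            SubSection "Display"
                Virtual 1920 1080    # raises the "maximum 1600 x 1600" limit xrandr reports
            EndSubSection
        EndSection

    After restarting X, the xrandr --newmode/--addmode/--output steps above should be able to apply the 1920x1080 mode without the "screen cannot be larger" error.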

    Read the article

  • Returning row values based on conditional formatting variables

    - by Mike Bodes
    I'm not entirely sure how to explain this properly, but here we go... I'm trying to create a single budgeting document that allows me to manage purchasing and reconciliation for multiple projects. I would like to create separate sheets per project and have purchased items populate on a master sheet. Using conditional formatting, I've set one of the columns to display an item's status (waiting for approval, approved, ordered, received). I would like the contents of an entire row to populate in a new sheet's table once the status is set to "Received." The sheet should update in descending order. I can't attach an image because I don't have 10 reputation. Any help is greatly appreciated.
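    One way to get that behaviour (a sketch only; the sheet name, column letters and ranges below are invented, since the actual layout isn't shown) is a FILTER formula on the master sheet that pulls through every project row whose status column reads "Received". In Google Sheets it would look roughly like this, and Excel 365 has a FILTER function with very similar syntax:

        =FILTER(Project1!A2:F, Project1!D2:D = "Received")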

    Read the article

  • XBMC: Viewing podcasts in 'Library Mode'

    - by greggannicott
    I'm having a great time getting to know XBMC, and so far, for the most part, I've been really happy with the results. I was chuffed when I followed the advice on this SU post and added TWiT as a video podcast with ease. However, when I go into Library Mode I can no longer access the podcasts I've added. I realise that one simple work-around is to come out of Library Mode to view podcasts, but in order to keep everything as simple (and appealing) to my wife as possible, I'd rather remain in Library Mode so that on the rare occasion she wants to watch a DVD, she can do so without my help. Does anyone know a way to display podcasts in Library Mode? If that isn't possible, is there a more elegant work-around than switching back and forth out of Library Mode? Many thanks.

    Read the article

  • tfs 2010 RC Agile Process template update New Task progress report

    Maybe my next post will just be about why I am so excited and impressed with the out-of-the-box templates. But for this first blog with my new focus, I thought I would just walk through the process I went through to create a task progress report (to enhance the out-of-the-box Agile template).

    So, I started with the MSF for Agile Development 5.0 RC template. After reviewing the template, I came away pretty excited about many of the new reports. I am especially excited about the Reporting Services reports. The big advantage I see here is that these query the warehouse directly instead of the Analysis Services cube, which means they are much closer to real-time - something I find very important for reports like burndown and task status. One report that I focused on right away was the User Story Progress Report. An overview is shown below:

    This report is very useful, but a lot of our internal managers really prefer to manage at the task level and either don't have stories in TFS or would like to view this type of report for tasks in addition to the User Stories. So, what did I do?

    Step 1: Download the Agile template. In VS 2010 RC, open Process Template Manager from Team->Team Project Collection Settings. Download the MSF for Agile Development template to your local file system. A process template is a folder of XML files. There is a ProcessTemplate.xml in the root and then a bunch of directories for things like Work Item Definitions and Queries, Reports, Shared Documents and Source Control Settings.

    Step 2: Copy the folder. My plan here is to make a new template with all of my modifications. You can also just update the MSF template in place; however, I think it is cleaner, once you start making modifications, to make your own template. So, copy the folder and name it with your new template name.

    Step 3: Change the template name. Open ProcessTemplate.xml and change the <name> of the template.

    Step 4: Copy the rdl of the report you want to use as a starting point. In my case, I copied Stories Progress.rdl and named the file Task Progress Breakdown.rdl. I reviewed the requirements for the new report with some of the users here and came up with this plan: it should show tasks and be expandable to show subtasks, and it should add Assigned To and Estimated Finish Date as two extra columns.

    Step 5: Walk through the existing report to understand how it works. The main thing I do here is try to get the SQL to run in SQL Server Management Studio, so I can walk through the process of building up the data for the report. After analyzing this particular report I found a couple of very useful things. One, this report is already built to display subtasks if I just flip the IncludeTasks flag to 1. So, if you are using stories and have tasks assigned to each story, this might give you everything you want. For my purposes, I did make that change to the Stories Progress report, as I find it a more useful report when I can see the tasks that comprise each story. But I still wanted a task-only version with the additional fields.

    Step 6: Update the report definition. I tend to work on rdl in Visual Studio directly as XML. Especially when I am just altering an existing report, I find it easier than trying to deal with the BI Studio designer. For my report I made the following changes:
    • Updated fields: removed Stack Rank and replaced it with Priority (since we don't use Stack Rank); added FinishDate and AssignedTo.

    • Changed the root deliverable SQL to pull @Task instead of @DeliverableCategory and added a join to CurrentWorkItemView for FinishDate and AssignedTo:

        SELECT cwi.[System_Id] AS ID FROM [CurrentWorkItemView] cwi
        WHERE cwi.[System_WorkItemType] IN (@Task)
        AND cwi.[ProjectNodeGUID] = @ProjectGuid

        SELECT lh.SourceWorkItemID AS ID FROM FactWorkItemLinkHistory lh
        INNER JOIN [CurrentWorkItemView] cwi ON lh.TargetWorkItemID = cwi.[System_Id]
        WHERE lh.WorkItemLinkTypeSK = @ParentWorkItemLinkTypeSK
            AND lh.RemovedDate = CONVERT(DATETIME, '9999', 126)
            AND lh.TeamProjectCollectionSK = @TeamProjectCollectionSK
            AND cwi.[System_WorkItemType] NOT IN (@DeliverableCategory)

    • Added AssignedTo and FinishDate columns to the @Rollups table.

    • Added two columns to the table used for column headers:

        <Tablix Name="ProgressTable">
          <TablixBody>
            <TablixColumns>
              <TablixColumn>
                <Width>2.7625in</Width>
              </TablixColumn>
              <TablixColumn>
                <Width>0.5125in</Width>
              </TablixColumn>
              <TablixColumn>
                <Width>3.4625in</Width>
              </TablixColumn>
              <TablixColumn>
                <Width>0.7625in</Width>
              </TablixColumn>
              <TablixColumn>
                <Width>1.25in</Width>
              </TablixColumn>
              <TablixColumn>
                <Width>1.25in</Width>
              </TablixColumn>
            </TablixColumns>

    • Added cells for the two new headers.

    • Added cells to the data table to include the two new values (Assigned To & Finish Date).

    • Changed a number of widths so that the report displays landscape and has room for the two additional columns.

    • Set the value of the IncludeTasks parameter to 1:

        <ReportParameter Name="IncludeTasks">
          <DataType>Integer</DataType>
          <DefaultValue>
            <Values>
              <Value>=1</Value>
            </Values>
          </DefaultValue>
          <Prompt>IncludeTasks</Prompt>
          <Hidden>true</Hidden>
        </ReportParameter>

    • Changed a few descriptions of how the report should be used.

    This is the resulting report. I have attached the final rdl.

    Step 7: Update ReportTasks.xml. The last step before the template is ready for use is to update the ReportTasks.xml file in the Reports folder. This file defines the reports that are available in the template:

        <report name="Task Progress Breakdown" filename="Reports\Task Progress Breakdown.rdl" folder="Project Management" cacheExpiration="30">
          <parameters>
            <parameter name="ExplicitProject" value="" />
          </parameters>
          <datasources>
            <reference name="/Tfs2010ReportDS" dsname="TfsReportDS" />
          </datasources>
        </report>

    Step 8: Upload the template. Open the Process Template Manager just like in Step 1 and upload the new template. That's it. One other note: if you want to add this report to an existing team project you will have to go into Report Manager (the Reporting Services portal) and upload the rdl to that project's directory.

    Read the article

  • SQL SERVER – Removing Leading Zeros From Column in Table – Part 2

    - by pinaldave
    Earlier I wrote a blog post about Removing Leading Zeros from a Column in a Table. It was a great coincidence that my friend Madhivanan (no introduction needed for him) also posted a similar article over on BeyondRelational.com. I strongly suggest reading his blog as well, as he has suggested some cool solutions to the same problem. The original blog post asked two questions: 1) is my sample for testing correct, and 2) is there any better method to achieve the same result? The response was amazing. I am proud of our SQL community, in which we all keep improving on each other's contributions. There are some really good suggestions in the comments. Let us go over them right now.

    Improving the ResultSet

    I had missed including all-zero values in my sample set, which was an oversight. Here is the new sample, which includes all-zero values as well.

    USE tempdb
    GO
    -- Create sample table
    CREATE TABLE Table1 (Col1 VARCHAR(100))
    INSERT INTO Table1 (Col1)
    SELECT '0001'
    UNION ALL
    SELECT '000100'
    UNION ALL
    SELECT '100100'
    UNION ALL
    SELECT '000 0001'
    UNION ALL
    SELECT '00.001'
    UNION ALL
    SELECT '01.001'
    UNION ALL
    SELECT '0000'
    GO

    Now let us go over some of the fantastic solutions we have received.

    Response from Rainmaker

    SELECT CASE PATINDEX('%[^0 ]%', Col1 + ' ')
           WHEN 0 THEN ''
           ELSE SUBSTRING(Col1, PATINDEX('%[^0 ]%', Col1 + ' '), LEN(Col1)) END
    FROM Table1

    Response from Harsh, Solution 1

    SELECT SUBSTRING(Col1, PATINDEX('%[^0 ]%', Col1 + 'a'), LEN(Col1))
    FROM Table1

    Response from Harsh, Solution 2

    SELECT RIGHT(Col1, LEN(Col1) + 1 - PATINDEX('%[^0 ]%', Col1 + 'a'))
    FROM Table1

    Response from lucazav

    SELECT T.Col1,
           label = CAST(CAST(REPLACE(T.Col1, ' ', '') AS FLOAT) AS VARCHAR(10))
    FROM Table1 AS T

    Response from iamAkashSingh

    SELECT REPLACE(LTRIM(REPLACE(col1, '0', ' ')), ' ', '0')
    FROM table1

    Here is the resultset of the above scripts. Each will remove any leading zero or space and display the number accordingly. If you believe there is a better solution, please leave a comment. I am just glad to see so many varied responses, and all of them teach us something new.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Inter Quake Model IQM render Directx9

    - by Andrew_0
    I'm trying to render an Inter Quake Model (http://lee.fov120.com/iqm/) that I exported from Blender in DirectX 9. I want to display animations, which IQM supports and my own model format does not. The model is a cylinder. It loads fine in the IQM SDK's OpenGL viewer, but when I try to render it in DirectX 9 using, for example (this is just to render the vertices):

    IDirect3DDevice9 * device;
    HRESULT hr = S_OK;
    for(int i = 0; i < nummeshes; i++)
    {
        iqmmesh &m = meshes[0];
        hr = device->DrawIndexedPrimitiveUP(D3DPT_TRIANGLELIST, 0, 3*m.num_triangles, m.num_triangles,
                                            &tris[m.first_triangle], D3DFMT_INDEX32,
                                            inposition, sizeof(unsigned int));
    }

    it renders like this: [incorrect result] The light grey bit that looks like two triangles in the middle is what gets rendered (ignore the other stuff). Whereas it is meant to look like this (using a custom importer I designed, which matches what is displayed in Blender): [correct result] Does anyone have any suggestions on what might be going wrong?
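    One thing worth checking here (an observation about the Direct3D 9 API, not something stated in the original post): the last argument of DrawIndexedPrimitiveUP is the stride of one whole vertex in the user-pointer vertex array, so sizeof(unsigned int) is almost certainly too small if inposition points at packed 3-float positions, and the device also needs a matching FVF or vertex declaration. A minimal sketch of what a corrected call might look like, assuming inposition is a tightly packed float[3]-per-vertex array and numvertices is the model's total vertex count (both are assumptions):

        // Hypothetical sketch -- positions only, no normals/UVs/bones yet
        device->SetFVF(D3DFVF_XYZ);

        for (int i = 0; i < nummeshes; i++)
        {
            iqmmesh &m = meshes[i];                      // meshes[i], not meshes[0]
            hr = device->DrawIndexedPrimitiveUP(
                D3DPT_TRIANGLELIST,
                0,                                       // MinVertexIndex
                numvertices,                             // vertices the indices may reference
                m.num_triangles,                         // PrimitiveCount
                &tris[m.first_triangle],                 // 32-bit index data
                D3DFMT_INDEX32,
                inposition,                              // vertex stream zero
                3 * sizeof(float));                      // stride of one whole vertex
        }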

    Read the article

  • WPF Login Verification Using Active Directory

    - by psheriff
    Back in October of 2009 I created a WPF login screen (Figure 1) that just showed how to create the layout for a login screen. That one sample is probably the most downloaded sample we have. So in this blog post, I thought I would update that screen and also hook it up to show how to authenticate your user against Active Directory. Figure 1: Original WPF Login Screen I have updated not only the code behind for this login screen, but also the look and feel as shown in Figure 2. Figure 2: An Updated WPF Login Screen The UI To create the UI for this login screen you can refer to my October of 2009 blog post to see how to create the borderless window. You can then look at the sample code to see how I created the linear gradient brush for the background. There are just a few differences in this screen compared to the old version. First, I changed the key image and instead of using words for the Cancel and Login buttons, I used some icons. Secondly I added a text box to hold the Domain name that you wish to authenticate against. This text box is automatically filled in if you are connected to a network. In the Window_Loaded event procedure of the winLogin window you can retrieve the user’s domain name from the Environment.UserDomainName property. For example: txtDomain.Text = Environment.UserDomainName The ADHelper Class Instead of coding the call to authenticate the user directly in the login screen I created an ADHelper class. This will make it easier if you want to add additional AD calls in the future. The ADHelper class contains just one method at this time called AuthenticateUser. This method authenticates a user name and password against the specified domain. The login screen will gather the credentials from the user such as their user name and password, and also the domain name to authenticate against. To use this ADHelper class you will need to add a reference to the System.DirectoryServices.dll in .NET. The AuthenticateUser Method In order to authenticate a user against your Active Directory you will need to supply a valid LDAP path string to the constructor of the DirectoryEntry class. The LDAP path string will be in the format LDAP://DomainName. You will also pass in the user name and password to the constructor of the DirectoryEntry class as well. With a DirectoryEntry object populated with this LDAP path string, the user name and password you will now pass this object to the constructor of a DirectorySearcher object. You then perform the FindOne method on the DirectorySearcher object. If the DirectorySearcher object returns a SearchResult then the credentials supplied are valid. If the credentials are not valid on the Active Directory then an exception is thrown. 
    C#

    public bool AuthenticateUser(string domainName, string userName, string password)
    {
      bool ret = false;

      try
      {
        DirectoryEntry de = new DirectoryEntry("LDAP://" + domainName,
                                               userName, password);
        DirectorySearcher dsearch = new DirectorySearcher(de);
        SearchResult results = null;

        results = dsearch.FindOne();

        ret = true;
      }
      catch
      {
        ret = false;
      }

      return ret;
    }

    Visual Basic

    Public Function AuthenticateUser(ByVal domainName As String, _
     ByVal userName As String, ByVal password As String) As Boolean
      Dim ret As Boolean = False

      Try
        Dim de As New DirectoryEntry("LDAP://" & domainName, _
                                     userName, password)
        Dim dsearch As New DirectorySearcher(de)
        Dim results As SearchResult = Nothing

        results = dsearch.FindOne()

        ret = True
      Catch
        ret = False
      End Try

      Return ret
    End Function

    In the Click event procedure under the Login button you will find the following code that validates the credentials the user types into the login window.

    C#

    private void btnLogin_Click(object sender, RoutedEventArgs e)
    {
      ADHelper ad = new ADHelper();

      if (ad.AuthenticateUser(txtDomain.Text,
            txtUserName.Text, txtPassword.Password))
        DialogResult = true;
      else
        MessageBox.Show("Unable to Authenticate Using the Supplied Credentials");
    }

    Visual Basic

    Private Sub btnLogin_Click(ByVal sender As Object, _
     ByVal e As RoutedEventArgs)
      Dim ad As New ADHelper()

      If ad.AuthenticateUser(txtDomain.Text, txtUserName.Text, _
                             txtPassword.Password) Then
        DialogResult = True
      Else
        MessageBox.Show("Unable to Authenticate Using the Supplied Credentials")
      End If
    End Sub

    Displaying the Login Screen

    At some point when your application launches, you will need to display your login screen modally. Below is the code that you would call to display the login form (named winLogin in my sample application). This code is called from the main application form, and thus the owner of the login screen is set to "this". You then call the ShowDialog method on the login screen to have the form displayed modally. After the user clicks on one of the two buttons you need to check what the DialogResult property was set to. The DialogResult property is a nullable type, so you first need to check whether the value has been set.

    C#

    private void DisplayLoginScreen()
    {
      winLogin win = new winLogin();

      win.Owner = this;
      win.ShowDialog();
      if (win.DialogResult.HasValue && win.DialogResult.Value)
        MessageBox.Show("User Logged In");
      else
        this.Close();
    }

    Visual Basic

    Private Sub DisplayLoginScreen()
      Dim win As New winLogin()

      win.Owner = Me
      win.ShowDialog()
      If win.DialogResult.HasValue And win.DialogResult.Value Then
        MessageBox.Show("User Logged In")
      Else
        Me.Close()
      End If
    End Sub

    Summary

    Creating a nice looking login screen is fairly simple to do in WPF. Using the Active Directory services from a WPF application should make your desktop programming task easier, as you do not need to create your own user authentication system. I hope this article gave you some ideas on how to create a login screen in WPF. NOTE: You can download the complete sample code for this blog entry at my website: http://www.pdsa.com/downloads. Click on Tips & Tricks, then select 'WPF Login Verification Using Active Directory' from the drop down list.
Good Luck with your Coding, Paul Sheriff

    Read the article

  • Mini DisplayPort to DVI-D Dual Link up to 2560 * 1600

    - by Steinway Wu
    I have a ThinkPad T530 with a Mini DP++ port and an NVS 5400M, which are said to support a 2560 x 1600 resolution. The display is a Dell 3007WFP, which also supports 2560 x 1600. But I have never seen any cables or adapters that can connect them with full support for 2560 x 1600. Do you have any idea? Thank you! In addition, I found that mDP++ means I can use a passive adapter, so there is no need for an active adapter, which is much more expensive. This is the only one that meets my needs, but it is not sold in the US. And this one seems to be OK, but I am not sure, because it says "2.5 Gbps per channel", which may not be enough.

    Read the article

  • MySQL – Introduction to User Defined Variables

    - by Pinal Dave
    MySQL supports user-defined variables to hold data that can be used in a later part of your query. You can save a value to a variable using a SELECT statement and access its value later. Unlike other RDBMSs, you do not need to declare the data type of a variable; the data type is assumed automatically when you assign a value. A value can be assigned to a variable using a SET command as shown below:

    SET @server_type:='MySQL';

    When the above command is executed, the value MySQL is assigned to the variable called @server_type. Now you can use this variable in a later part of the code. If you want to display the value, you can use a SELECT statement.

    SELECT @server_type;

    The result is MySQL. Once the value is assigned it remains for the entire session until changed by later statements. So, unlike SQL Server, you do not need to have this as part of the execution code every time. (In SQL Server, the variables are execution-scoped and dropped after the execution.) You can give the result a column name as below:

    SELECT @server_type AS server_type;

    You can also use a SELECT statement to declare and select the value of a variable:

    SELECT @message:='Welcome to MySQL' AS MESSAGE;

    The result is

    Message
    --------
    Welcome to MySQL

    You can make use of variables to apply many kinds of logic effectively. One useful method is generating a row number for each row, as shown in the post MySQL – Generating Row Number for Each Row using Variable.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: MySQL, PostADay, SQL, SQL Authority, SQL Query, SQL Tips and Tricks, T SQL
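    As a quick illustration of that row-numbering technique, here is a minimal sketch (the table and column names are invented for the example, not taken from the post):

        -- Number each row of a result set using a user-defined variable
        SET @row_num := 0;

        SELECT (@row_num := @row_num + 1) AS row_number,
               first_name
        FROM   customers
        ORDER  BY first_name;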

    Read the article

  • Dell printer goes offline after second print job.

    - by Ac0ua
    The Dell printer goes offline (server connection) after the second print job, although the printer's display says it is ready. If you turn the printer off and then on, you can send one job, and then it goes back offline (server connection) on the next print job. We have multiple Dell 2330dn printers installed through a print server; only one of the printers is experiencing this problem. Two different users. Two different machines. Two different operating systems (Win7 and Vista). The computers have been reset. Dell printers have a web interface (through the IP address) if this helps. Thanks for any help!

    Read the article

  • MSCC: Purpose and benefits of Version Control Systems (VCS)

    Unfortunately, there was no monthly meetup during May. Which means that it was even more important and interesting to go forward with a great topic for this month. Earlier this year I already spoke to Nayar Joolfoo about doing a presentation on version control systems (VCS), and he gladly agreed. It was just about finding the right date for the action. Furthermore, it was also a great coincidence that Avinash Meetoo announced on social media networks that Knowledge 7 is about to have a new training on "Effective git" - which correlates to a book title Avinash is currently working on - all the best with your approach on this, and reach out to our MSCC craftsmen for reviews. Once again a big thank you to Orange Ebene Accelerator for providing the venue for us, and to the MSCC members involved in securing the time slot for our event. Unfortunately, it's kind of tough to get an early confirmation for our meetups these days. I'll keep you posted on that one as there are some interesting and exciting options coming up soon. Okay, let's talk about the meeting and version control systems again. As usual, I'm going to put my first impression of the meetup: "Absolutely great topic, questions and discussions on version control systems, like git or VSO. I was also highly pleased by the number of first timers and female IT geeks. Hopefully, we will be able to keep this trend for future get-togethers." And I really have to emphasise the amount of fresh blood coming to our gathering. Also, during the initial phase it was surprising to see that exactly those first-timers, most of them students at various campuses here on the island, had absolutely no idea about version control systems. More about that further down...

    Reactions of other attendees

    If I counted correctly, we had a total of 17 attendees this month, and I'd like to give you feedback from some of them: "Inspiring. Helped me understand more about GIT." -- Sean on event comments "Joined the meetup today with literally no idea what is a version control system. I have several reasons why I should be starting to use VCS as from NOW in my projects. Thanks Nayar, Jochen and other participants :)" -- Yudish on event comments "Was present today and I'm very satisfied. I was not aware if there was a such tool like git available. Thanks to those who contributed for this meetup. It was great. Learned a lot from this meetup!!" -- Leonardo on event comments "Seriously, I can see how it's going to ease my task and help me save time. Gone are the issues with files backups. And since I'll be doing my dissertation this year, using Git would help me a lot for my backups and I'm grateful to Nayar for the great explanation." -- Swan-Iyah on MSCC meetup : Version Controls

    Hopefully, I'll be able to get some other sources - personal blogs preferred - on our meeting. Geeks, thank you so much for those encouraging comments. It's really great to experience that we, all members of the MSCC, are doing the right thing to get more IT information out, and to help each other to improve and evolve in our professional careers.

    Our agenda of the day

    Honestly, we had a bumpy start... First, I was battling a little bit with the movable room divider in order to maximize the space. I mean, we had 24 RSVPs and usually there might be additional people coming along. Then, for whatever reason, we were facing power outages - actually twice in short periods. Not too good for the projector after all, but hey, it went smoothly for the rest of the time. And last but not least...
    our first speaker Nayar got stuck somewhere on the road. ;-) Anyway, not a real show-stopper, and we used the time until Nayar's arrival to introduce ourselves a little bit. It is always important for me to get to know the "newbies" a little bit, and as a result we had lots of university students - first year, second year and recent graduates - among them. Surprisingly, none of them had ever been in contact with version control systems at all. I mean, this is a shocking discovery! Similar to the ability of touch-typing, I'd say that being able to use (and master) any kind of version control system is compulsory in any job in the IT industry. Seriously, I'm wondering what is being taught during the classes on campus. All of them have to work on semester assessments or final projects, even in small teams of 2-4 people. That's the perfect occasion to get started with VCS. Already in this phase, we had great input from more experienced VCS users, like Sean, Avinash and myself.

    git - a modern approach to VCS - Nayar

    What a tour! Nayar gave us the full round of git from start to finish, even touching some more advanced techniques. First, he explained the importance of version control systems as an essential tool for software developers, even when working alone on a project, and the ability to have a kind of "time machine" that allows you to inspect and revert to a previous version of source code at any time. Then he showed how easy it is to install git on an Ubuntu-based system, but also mentioned that git is literally available for any operating system, like Windows, Mac OS X and of course other Linux distributions. Next, he showed us how to set the initial configuration values of user name and email address, which simplifies the daily usage of the git client while working with your repositories. Then he initialised and added a new repository for some local development of blogging software. All commands were done using the command line interface (CLI) so that they can be repeated on any system as a reference. The syntax and the procedure are always the same, and Nayar clearly mentioned this to the attendees. Now, with a git repository in place, it was about time to work on some "important" changes to the blogging software - just for the sake of demonstrating the ease of use and power of git. One interesting question came very early: "How many commands do we have to learn? It looks quite difficult at the moment" - well, rest assured that during daily development cycles you will need fewer than 10 git commands on a regular basis: git add, commit, push, pull, checkout, and merge. And Nayar demo'd all of them. Much to the delight of everyone, he also showed gitk, which is the git repository browser. It's a UI tool to display changes in a repository or a selected set of commits. This includes visualizing the commit graph, showing information related to each commit, and the files in the trees of each revision.

    Using gitk to display and browse information of a local git repository

    And last but not least, we took advantage of the internet connectivity and reached out to various online portals offering git hosting for free. Nayar showed us how to push the local repository into a remote system on github, showing the web-based git browser and history handling, and then also explained and demo'd how to connect to existing online repositories in order to get access to either your own source code or other people's open source projects.
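    For readers who want to retrace that demo, a minimal sketch of the daily cycle looks roughly like this (the repository name and remote URL are placeholders, not the ones used in the session):

        # one-time setup
        git config --global user.name  "Your Name"
        git config --global user.email "you@example.com"

        # create a local repository and make a first commit
        git init blog-engine
        cd blog-engine
        git add .
        git commit -m "Initial import"

        # publish it to a remote host such as github and keep it in sync
        git remote add origin https://github.com/your-account/blog-engine.git
        git push -u origin master
        git pull            # fetch and merge changes from the remote
        gitk                # browse the commit history graphically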
    Next to github, we also spoke about bitbucket and gitlab as potential online platforms for your projects. Have a look at the conditions and details of their free service packages and what you can get additionally as a paying customer. Usually, you already get a lot of services for up to five users for free, but there might be other important aspects that have an impact on your decision. Anyway, moving git-based repositories between systems is a piece of cake, and changing online platforms is possible at any stage of your development.

    Visual Studio Online (VSO) - Jochen

    Well, Nayar literally covered all elements of working with git during his session, including the use of external online platforms. So, what would be the advantage of talking about Visual Studio Online (VSO)? First of all, VSO is "just another" online platform for hosting and managing git repositories on remote systems, equivalent to github, bitbucket, or any other web site. At the moment (of writing), Microsoft also provides a free package of up to five users / developers on a git repository, but there is more in that package. Of course, it is related to software development on Windows systems and the bonds are tight towards the use of Visual Studio, but from experience you are absolutely not restricted to that. Connecting a Linux or Mac OS X machine with a git client or an integrated development environment (IDE) like Eclipse or Xcode works as smoothly as expected. So, why should one opt in for VSO? Well, one of the main aspects that I would like to mention here is that VSO integrates Microsoft's Application Lifecycle Management (ALM) into the platform. Meaning that you get agile project management with backlogs, sprints and burn-down charts, as well as the ability to track tasks, bug reports and work items, next to collaborative team chats. It's the whole package of agile development you'll get. And, something I mentioned briefly at the beginning of our meeting, VSO gives you the possibility of an automated continuous integration (CI) process which builds, and can run tests against, your source code after each commit of changes. Having a proper CI strategy is also part of the Clean Code Developer practices - on Level Green actually - and not only simplifies your life as a software developer but also reduces the sources of potential errors.

    Seamless integration and automated deployment between Microsoft Azure Web Sites and a git repository

    But my favourite feature is the seamless continuous deployment to Microsoft Azure. Especially while working on web projects, it's absolutely astonishing that as soon as you commit your changes it takes just a couple of seconds until your modifications are deployed and available on your Azure-hosted web sites.

    Upcoming events and networking

    Due to the adjusted times, everybody was kind of hungry and we didn't follow up on networking or upcoming events - very unfortunate in my opinion, and this will have an impact on future planning of our meetups. Because I would rather see more conversations during and at the end of our meetings than everyone just packing their laptops, bags and accessories and rushing off to grab some food. I was hoping to get some information regarding this year's Code Challenge - supposedly to be organised during July? Maybe someone could leave a comment on that - but I couldn't get any updates. Well, I'll keep digging...
    In case you would like to get more into git and how to use it effectively, please check out Knowledge 7's upcoming course on "Effective git". Thanks, Avinash, for your vital input into today's conversation, and I'm looking forward to getting a grip on your book very soon.

    My resume of the day

    Do not work in IT without any kind of version control system! Seriously, without a VCS in place you're doing it wrong. It's like driving a car without seat belts fastened or riding your bike without a safety helmet. You don't do that! End of discussion. ;-) Nowadays, with access to free (as in cost) tools to install on your machine and numerous online platforms that host your source code for free for up to five users, it's a no-brainer to get yourself familiar with VCS. Today's sessions gave a good overview of how to start using git and how to connect to various remote services like github or VSO.

    Read the article

  • Benefits of PerformancePoint Services Using SharePoint Server 2010

    - by Wayne
    What is PerformancePoint Services?

    Most of the time the metrics that make up your key performance indicators are not simple values from a data source. In SharePoint Server 2007 PerformancePoint Services, you could create two kinds of KPI metrics: simple single-value metrics from any supported data source, or complex multiple-value metrics from a single Analysis Services data source using MDX. Now things are even easier with PerformancePoint Services in SharePoint 2010. Let us check what it is. PerformancePoint Services in SharePoint Server 2010 is a performance management service that you can use to monitor and analyze your business. By providing flexible, easy-to-use tools for building dashboards, scorecards, reports, and key performance indicators (KPIs), PerformancePoint Services can help everyone across an organization make informed business decisions that align with companywide objectives and strategy. Scorecards, dashboards, and KPIs help drive accountability. Integrated analytics help employees move quickly from monitoring information to analyzing it and, when appropriate, sharing it throughout the organization. Prior to the addition of PerformancePoint Services to SharePoint Server, Microsoft Office PerformancePoint Server 2007 functioned as a standalone server. Now PerformancePoint functionality is available as an integrated part of the SharePoint Server Enterprise license, as is the case with Excel Services in Microsoft SharePoint Server 2010. The popular features of earlier versions of PerformancePoint Services are preserved along with numerous enhancements and additional functionality.

    New PerformancePoint Services features

    PerformancePoint Services can now utilize SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. Dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework.

    New features and enhancements of SharePoint 2010 PerformancePoint Services

    • With PerformancePoint Services functioning as a service in SharePoint Server, dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework. The new architecture also takes advantage of SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. You can also include and link PerformancePoint Services Web Parts with other SharePoint Server Web Parts on the same page. The new architecture also streamlines security models that simplify access to report data.

    • The Decomposition Tree is a new visualization report type available in PerformancePoint Services. You can use it to quickly and visually break down higher-level data values from a multi-dimensional data set to understand the driving forces behind those values. The Decomposition Tree is available in scorecards and analytic reports and ultimately in dashboards.

    • You can access more detailed business information with improved scorecards. Scorecards have been enhanced to make it easy for you to drill down and quickly access more detailed information. PerformancePoint scorecards also offer more flexible layout options, dynamic hierarchies, and calculated KPI features. Using this enhanced functionality, you can now create custom metrics that use multiple data sources. You can also sort, filter, and view variances between actual and target values to help you identify concerns or risks.
    • Better Time Intelligence filtering capabilities that you can use to create and use dynamic time filters that are always up to date. Other improved filters improve the ability for dashboard users to quickly focus in on information that is most relevant.

    • Ability to include and link PerformancePoint Services Web Parts together with other PerformancePoint Services Web Parts on the same page.

    • Easier to author and publish dashboard items by using Dashboard Designer.

    • SQL Server Analysis Services 2008 support.

    • Increased support for accessibility compliance in individual reports and scorecards.

    • The KPI Details report is a new report type that displays contextually relevant information about KPIs, metrics, rows, columns, and cells within a scorecard. The KPI Details report works as a Web Part that links to a scorecard or individual KPI to show relevant metadata to the end user in SharePoint Server. This Web Part can be added to PerformancePoint dashboards or any SharePoint Server page.

    • Create analytic reports to better understand underlying business forces behind the results. Analytic reports have been enhanced to support value filtering, new chart types, and server-based conditional formatting.

    To conclude, PerformancePoint Services, by becoming tightly integrated with SharePoint Server 2010, takes advantage of many enterprise-level SharePoint Server 2010 features. Unfortunately, SharePoint Foundation 2010 doesn't include this feature. There are still many choices in the SharePoint family of products, including SharePoint Server 2010, SharePoint Foundation, SharePoint Server 2007 and associated free SharePoint web parts and templates.

    Read the article

  • Brightness keeps changing in Windows 8.1 (on Macbook Pro Retina)

    - by gzak
    Before anyone gets too excited, it's not the "Adaptive Brightness" feature of the OS. I've already turned that off. Also it seems to have nothing to do with ambient light. It actually seems to do with the average "color" of the display. If I'm working in dark-themed Visual Studio, the brightness "pops" brighter. When I switch to the browser, it "pops" darker. So it's kind of adaptive brightness based on average pixel color (or something like that). What makes it rather annoying is that the brightness pops, rather than transitioning gradually. What is this feature, and how do I disable it (or at least make it smoother)?

    Read the article

  • Inverted function keys (F1-F12) on HP Pavilion dv4t

    - by The Electric Muffin
    (I know there have already been a lot of questions about this, but none of them mentioned the dv4t specifically.) I'm thinking about getting an HP Pavilion dv4t-4200 or -5100, but something that's really irritating me is that by default the function keys (F1-F12) are "inverted"—without holding the Fn key, the function keys do things like change the brightness, change or mute the volume, and switch to an external display. Only if you hold Fn will they actually produce F1, F2, etc. This is not how keyboards are supposed to work. Is there any way to disable this "feature" that has been verified to work on the HP Pavilion dv4t-4200 or HP Pavilion dv4t-5100? I don't want to buy this computer unless this is possible.

    Read the article

  • software package disappeared in GPO after access rights change

    - by sirka
    Hi, in a GPO, the item Computer > Software settings > Assigned applications > IE8_package had "Authenticated users" set to "Deny" on its Security tab. After that it disappeared from the GPO: it is still there, but it is not shown anywhere. The intention was to disable installation of that package for now, while still having the other packages in that GPO installed. I know now that it was a stupid decision. Is there any way to display that package again? Please help.

    Read the article

  • ADSIEdit freezes getting properties of a group with hundreds of thousands of members

    - by ixe013
    Doing performance testing on AD LDS (Server 2008 R2, 64-bit), we created a million users in a single OU. We also created a single group object and made those million users members of that group. When we try to list the million users, ADSI Edit times out with an error message saying it cannot display that many users. Fine. But if we open the properties of the group, ADSI Edit freezes, eating up all available memory and thrashing the CPU (nearly 60M page faults in under an hour). AD LDS (running on another computer) is barely hitting the 1% CPU mark, servicing other LDAP requests as if nothing were happening. We can throw more memory at the problem, but more users will have to be managed one day and we will be back at square one. Is there a way to set a limit in ADSI Edit so that it will not hang the computer when retrieving a very large multi-valued object?

    Read the article

  • Are there netcat-like tools for Windows which are not quarantined as malware?

    - by Matthew Murdoch
    I used to use netcat for Windows to help track down network connectivity issues. However, these days my anti-virus software (Symantec, but I understand others display similar behaviour) quarantines netcat.exe as malware. Are there any alternative applications which provide at least the following functionality: (1) connect to an open TCP socket and send data typed on the console to it, and (2) open and listen on a TCP socket and print received data to the console? I don't need the 'advanced' features (which are possibly the reason for the quarantining) such as port scanning or remote execution.

    Read the article

  • Creating custom controls for ASP.NET

    - by jaullo
    While it is true that ASP.NET contains many controls that make our lives easier, on many occasions we need additional functionality. One option is to turn to creating custom controls. This will be the first of several posts I will dedicate to showing how to create some custom controls using extremely simple and easy-to-understand elements. For this we will use only a RegularExpressionValidator and a few regular expressions. In this example we will extend the functionality of a TextBox so that it validates credit card numbers. Our TextBox must verify that there are 16 digits, in groups of 4, separated by a -.

    So, we create a new ASP.NET server control project.

    First, we import the namespaces:

    Imports System.ComponentModel
    Imports System.Web
    Imports System.Web.UI.WebControls
    Imports System.Web.UI

    Second, we create our class:

    Public Class TextboxCreditCardNumber
    End Class

    Now we tell our class that we are going to inherit from TextBox:

    Public Class TextboxCreditCardNumber
        Inherits TextBox
    End Class

    Once we have this, our programming base is ready, so let's code the new functionality.

    We declare our variables and a public property that will hold the error message returned to the user; it is public so that it can be customized:

        Private req As New RegularExpressionValidator
        Private mstrmensaje As String = "Número de Tarjeta Invalido"

        Public Property MensajeError() As String
            Get
                Return mstrmensaje
            End Get
            Set(ByVal value As String)
                mstrmensaje = value
            End Set
        End Property

    Now we define the OnInit method of our control, in which we assign the properties and initialize our functionality:

        Protected Overrides Sub OnInit(ByVal e As System.EventArgs)
            req.ControlToValidate = MyBase.ID
            req.ErrorMessage = mstrmensaje
            req.Display = ValidatorDisplay.Dynamic
            req.ValidationExpression = "^(\d{4}-){3}\d{4}$|^(\d{4} ){3}\d{4}$|^\d{16}$"
            Controls.Add(New LiteralControl("&nbsp;"))
            Controls.Add(req)
            MyBase.OnInit(e)
        End Sub

    And finally, we define the Render method (which is responsible for drawing our control):

        Protected Overrides Sub Render(ByVal writer As System.Web.UI.HtmlTextWriter)
            MyBase.Render(writer)
            req.RenderControl(writer)
        End Sub

    The only thing left now is to compile our class and add the new control to the controls ToolBox so that it can be used.
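    A minimal sketch of how the compiled control might then be dropped onto a page (the assembly name, namespace and tag prefix below are invented for the example):

        <%@ Register Assembly="MyCustomControls" Namespace="MyCustomControls" TagPrefix="cc" %>

        <cc:TextboxCreditCardNumber ID="txtCard" runat="server"
            MensajeError="Número de Tarjeta Invalido" />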

    Read the article

  • How to add missing fonts to Adobe Illustrator?

    - by WilliamKF
    When opening an Adobe Acrobat PDF document on Mac OS X Lion to edit in Adobe Illustrator CS6, I got the message: The font Helvetica-Narrow-Bold is missing. Affected text will be displayed using a substitute font. The font QuickTypePi is missing. Affected text will be displayed using a substitute font. How can I provide the missing fonts so that no substitution occurs and upon return to Adobe Acrobat Pro X the original fonts will remain after any edits in Illustrator? Or, since the message talks about their display, will the font remain unchanged upon return to Adobe Acrobat?

    Read the article

  • "Breadcrumbs" for series of hostnames?

    - by Hamy
    Does anyone know of a shell that would show a series of breadcrumbs as I navigate into/out of various servers, like this: Home > Build Machine > Vagrant > Docker-base Hopefully it could auto-detect logging in and out of various boxes and display the hostnames. Perhaps with a simple "no circular links", one could just try and monitor the hostname, but I don't know if there is a shell that can easily act as a 'parent' to the other shells on these various systems so that it can query hostname and/or other item. Any thoughts?

    Read the article

  • Investigation: Can different combinations of components affect Dataflow performance?

    - by jamiet
    Introduction

    The Dataflow task is one of the core components (if not the core component) of SQL Server Integration Services (SSIS) and often the most misunderstood. This is not surprising; it's an incredibly complicated beast and we're abstracted away from that complexity via some boxes that go yellow, red or green and that have some lines drawn between them.

    Example dataflow

    In this blog post I intend to look under that facade and get into some of the nuts and bolts of the Dataflow Task by investigating how the decisions we make when building our packages can affect performance. I will do this by comparing the performance of three dataflows that all have the same input, all produce the same output, but which all operate slightly differently by way of having different transformation components. I also want to use this blog post to challenge a commonly held opinion that I see perpetuated over and over again on the SSIS forum. That is, people assume that adding components to a dataflow will be detrimental to overall performance. It's not surprising that people think this - it is intuitive to think that more components means more work - however this is not a view that I share. I have always been of the opinion that there are many factors affecting dataflow duration and the number of components is actually one of the less important ones; having said that, I have never proven that assertion, and that is one reason for this investigation. I have actually seen evidence that some people think dataflow duration is simply a function of number of rows and number of components. I'll happily call that one out as a myth even without any investigation!

    The Setup

    I have a 2GB data file which is a list of 4731904 (~4.7 million) customer records with various attributes against them, and it contains 2 columns that I am going to use for categorisation: [YearlyIncome] and [BirthDate]. The data file is an SSIS raw format file, which I chose to use because it is the quickest way of getting data into a dataflow, and given that I am testing the transformations, not the source or destination adapters, I want to minimise external influences as much as possible. In the test I will split the customers according to month of birth (12 of those) and whether or not their yearly income is above or below 50000 (2 of those); in other words I will be splitting them into 24 discrete categories, and in order to do it I shall be using different combinations of SSIS' Conditional Split and Derived Column transformation components. The 24 datapaths that occur will each input to a rowcount component, again because this is the least resource-intensive means of terminating a datapath. The test is being carried out on a Dell XPS Studio laptop with a quad core (8 logical procs) Intel Core i7 at 1.73GHz and a Samsung SSD hard drive. It's running SQL Server 2008 R2 on Windows 7.

    The Variables

    Here are the three combinations of components that I am going to test:

    One Conditional Split - A single Conditional Split component, CSPL Split by Month of Birth and Income Category, that will use expressions on [YearlyIncome] & [BirthDate] to send each row to one of 24 outputs. This next screenshot displays the expression logic in use:

    Derived Column & Conditional Split - A Derived Column component, DER Income Category, that adds a new column [IncomeCategory] which will contain one of two possible text values {"LessThan50000","GreaterThan50000"} and uses [YearlyIncome] to determine which value each row should get.
    A Conditional Split component, CSPL Split by Month of Birth and Income Category, then uses that new column in conjunction with [BirthDate] to determine which of the same 24 outputs to send each row to. Put more simply, I am separating the Conditional Split of #1 into a Derived Column and a Conditional Split. The next screenshots display the expression logic in use: DER Income Category and CSPL Split by Month of Birth and Income Category.

    Three Conditional Splits - A Conditional Split component that produces two outputs based on [YearlyIncome], one for each Income Category. Each of those outputs will go to a further Conditional Split that splits the input into 12 outputs, one for each month of birth (identical logic in each). In this case, then, I am separating the single Conditional Split of #1 into three Conditional Split components. The next screenshots display the expression logic in use: CSPL Split by Income Category, CSPL Split by Month of Birth 1 & 2.

    Each of these combinations will provide an input to one of the 24 rowcount components, just the same as before. For illustration here is a screenshot of the dataflow containing three Conditional Split components:

    As you can see, these dataflows have a fair bit of work to do, and remember that they're doing that work for 4.7 million rows. I will execute each dataflow 10 times and use the average for comparison. I foresee three possible outcomes: (1) the dataflow containing just one Conditional Split (i.e. #1) will be quicker; (2) there is no significant difference between any of them; (3) one of the two dataflows containing multiple transformation components will be quicker. Regardless of which of those outcomes comes to pass, we will have learnt something, and that makes this an interesting test to carry out. Note that I will be executing the dataflows using dtexec.exe rather than hitting F5 within BIDS.

    The Results and Analysis

    The table below shows all of the executions, 10 for each dataflow. It also shows the average for each along with a standard deviation. All durations are in seconds. I'm pasting a screenshot because I frankly can't be bothered with the faffing about needed to make a presentable HTML table. It is plain to see from the average that the dataflow containing three Conditional Splits is significantly faster, the other two taking 43% and 52% longer respectively. This seems strange though, right? Why does the dataflow containing the most components outperform the other two by such a big margin? The answer is actually quite logical when you put some thought into it, and I'll explain that below. Before progressing, a side note: the standard deviation for the "Three Conditional Splits" dataflow is orders of magnitude smaller, indicating that performance for this dataflow can also be predicted with much greater confidence.

    The Explanation

    I refer you to the screenshot above that shows how CSPL Split by Month of Birth and Income Category in the first dataflow is set up. Observe that there is a case for each combination of Month of Birth and Income Category - 24 in total. These expressions get evaluated in the order that they appear, and hence, if we assume that Month of Birth and Income Category are uniformly distributed in the dataset, we can deduce that the expected number of expression evaluations for each row is 12.5, i.e. (1 (the minimum) + 24 (the maximum)) divided by 2 = 12.5. Now take a look at the screenshots for the second dataflow.
    We are doing one expression evaluation in DER Income Category and we have the same 24 cases in CSPL Split by Month of Birth and Income Category as we had before; only the expression differs slightly. In this case, then, we have 1 + 12.5 = 13.5 expected evaluations for each row, which would account for the slightly longer average execution time for this dataflow. Now on to the third dataflow, the quick one. CSPL Split by Income Category does a maximum of 2 expression evaluations, thus the expected number of evaluations per row is 1.5. CSPL Split by Month of Birth 1 and CSPL Split by Month of Birth 2 both have less work to do than the previous Conditional Split components because they only have 12 cases to test for, thus the expected number of expression evaluations in each is 6.5. There are two of them, so the total expected number of expression evaluations for this dataflow is 6.5 + 6.5 + 1.5 = 14.5. 14.5 is still more than 12.5 and 13.5, though, so why is the third dataflow so much quicker? Simple: the conditional expressions in the first two dataflows have two boolean predicates to evaluate, one for Income Category and one for Month of Birth; the expressions in the Conditional Split in the third dataflow, however, only have one predicate, thus they are doing a lot less work. To sum up, the difference in execution times can be attributed to the difference between:

    MONTH(BirthDate) == 1 && YearlyIncome <= 50000

    and

    MONTH(BirthDate) == 1

    In the first two dataflows YearlyIncome <= 50000 gets evaluated an average of 12.5 times for every row, whereas in the third dataflow it is evaluated once and once only. Multiply those 11.5 extra operations by 4.7 million rows and you get a significant amount of extra CPU cycles - that's where our duration difference comes from.

    The Wrap-up

    The obvious point here is that adding new components to a dataflow isn't necessarily going to make it go any slower; moreover, you may be able to achieve significant improvements by splitting logic over multiple components rather than one. Performance tuning is all about reducing the amount of work that needs to be done, and that doesn't necessarily mean using fewer components; indeed, sometimes you may be able to reduce workload in ways that aren't immediately obvious, as I think I have proven here. Of course there are many variables in play here and your mileage will most definitely vary. I encourage you to download the package and see if you get similar results - let me know in the comments. The package contains all three dataflows plus a fourth dataflow that will create the 2GB raw file for you (you will also need the [AdventureWorksDW2008] sample database from which to source the data); simply disable all dataflows except the one you want to test before executing the package and remember, execute using dtexec, not within BIDS. If you want to explore dataflow performance tuning in more detail then here are some links you might want to check out: Inequality joins, Asynchronous transformations and Lookups; Destination Adapter Comparison; Don't turn the dataflow into a cursor; SSIS Dataflow – Designing for performance (webinar). Any comments? Let me know! @Jamiet

    Read the article

  • Is there a way to change the GTK theme for applications run as superuser on KDE?

    - by Patches
    When I run GTK applications on KDE, they use the QtCurve theme that matches my color and font scheme as configured in the KDE System Settings application. However, GTK applications run as superuser use the old default GNOME theme, regardless of whether I run them with kdesudo, gksudo, or sudo on a terminal. For example, here's gedit run as superuser on top, and under my normal user account on the bottom: Strangely, KDE applications run with kdesudo display the default Oxygen styling but use my settings when run with sudo on a terminal. Is there any way to configure the styling GTK applications use when run as superuser on KDE?
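    One workaround that has helped in similar setups (a hedged suggestion, not from the question itself): GTK2 applications read their theme from the .gtkrc-2.0 file in the home directory of the user they run as, and KDE's GTK appearance settings only write that file for your own account, so root falls back to the default theme. Copying the file into root's home directory may give superuser GTK apps the same QtCurve look:

        # Assumes the GTK2 theme settings live in ~/.gtkrc-2.0 (some KDE setups
        # also use ~/.gtkrc-2.0-kde4); exact file names can vary by distribution.
        sudo cp ~/.gtkrc-2.0 /root/
        [ -f ~/.gtkrc-2.0-kde4 ] && sudo cp ~/.gtkrc-2.0-kde4 /root/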

    Read the article

  • Vi on Linux: show ^M line endings for DOS-format files

    - by sss
    On Solaris, if you open a file in vi that has Windows line endings, this shows up as ^M at the end of every line. On Linux, vi is cleverer and understands the Windows file format, and does not display ^M. Is there a setting to make Linux vi behave the same as Solaris in this respect? A common problem for us is copying a shell script off a (Windows) dev box and forgetting to dos2unix it, and then being confused when it doesn't work properly. On Solaris the problem is obvious as soon as you vi the file, but not on Linux. Thanks.
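    One setting that should restore the Solaris behaviour (a hedged suggestion, assuming your "vi" is actually vim, as it is on most Linux distributions): stop vim auto-detecting the DOS file format, so the carriage returns stay in the buffer and show up as ^M:

        # Treat every file as unix format; DOS line endings are then kept
        # and displayed as ^M instead of being silently recognised.
        echo 'set fileformats=unix' >> ~/.vimrc

    For a one-off check you can also re-read the file in place with :e ++ff=unix from within vim.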

    Read the article

  • Two questions about restoring Thunderbird from a backup

    - by Eric
    Setting up a new Windows 7 PC, I'm puzzled by two things in Thunderbird 3.1.9: I restored a profile from a three-month-old backup, no problem. I then copied more recent files into the Mail/ directory, but TBird still shows the old messages. The last message in Inbox is dated 3/16/2011 -- how do I get TBird to display all the messages in the Local Folders/Inbox view? A large number of the existing messages are now displayed in separate tabs -- I can't tell you how many, but there could be over 1000. Which file governs this? Or can I hire someone from Mechanical Turk to come over and manually close each tab?
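    For the first issue, one thing worth trying (a hedged suggestion, assuming the mail is in standard mbox files): each folder has a companion .msf summary file, and copying newer mbox files over an old profile can leave a stale index that still lists only the old messages. With Thunderbird closed, deleting the .msf files forces it to rebuild them from the mbox contents:

        rem Run from a Command Prompt with Thunderbird closed; the profile folder
        rem name (xxxxxxxx.default) is a placeholder - substitute your own.
        cd /d "%APPDATA%\Thunderbird\Profiles\xxxxxxxx.default\Mail\Local Folders"
        del *.msf

    For the second issue, the set of tabs that reopen at startup should be stored in session.json in the same profile folder; deleting that file (again with Thunderbird closed) ought to close them all in one go.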

    Read the article

  • Prevent Changing the Screen Saver and Wallpaper in Windows 7

    - by Mysticgeek
    Sometimes you might not want users to have the ability to change Screen Savers and Wallpaper on Windows 7 workstations. Today we look at how to prevent them from changing either one or both. You might administer computers in your home or small office and find it annoying when users continuously change the wallpaper and Screen Savers to something obnoxious. A lot of the time they might be inexperienced users who download these so-called "wonderful and free" Screen Saver/Wallpaper packages from shady sites that include loads of Spyware. Preventing users from changing them is another helpful way to avoid time wasted switching things back.

    Prevent Changing Screensavers & Wallpaper Using Group Policy Editor

    Note: This method uses Group Policy, which is not available in Home versions of Windows 7.

    Open the Start Menu, enter gpedit.msc into the Search box and hit Enter. When Local Group Policy Editor opens, navigate to User Configuration \ Administrative Templates \ Control Panel \ Personalization. Then in the right column double-click on Prevent changing desktop background. Check the radio button next to Enabled, then click OK. Back on the Group Policy screen, double-click on Prevent changing screen saver. In the next screen select the radio button next to Enabled, click OK, then close out of Group Policy Editor.

    Now when a user goes into the Personalization section, the Desktop Background hyperlink is grayed out and inactive. Notice the message "One or more of the settings on this page has been disabled by the system administrator" at the bottom of the section. If they click to change the Screen Saver, an error message will pop up letting them know the function is disabled.

    Prevent Changing Screensavers & Wallpaper Using a Registry Hack

    You can also make a couple of Registry changes to prevent users from changing the Wallpaper and Screen Saver, which will work on Home versions of Windows 7. Before making any Registry changes, make sure you back the Registry up first. Open the Registry Editor by typing regedit into the Search box in the Start menu and hitting Enter.

    First we'll start with the Wallpaper. Navigate to HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\System and create a new String Value named Wallpaper. Then modify the Value data to point to the location of the wallpaper you want it to always be (in this example it's our main wallpaper on our local drive), then click OK.

    Now let's make sure they can't change the Screen Saver. In the same Registry location, create a new DWORD (32-bit) Value. Give it the Value name NoDispScrSavPage and the Value data "1", then click OK. Close out of the Registry and restart the machine, or simply log off then back on again, for the changes to take effect.

    Results

    For the Wallpaper, a user can still go in and see the selections; however, if they try to change it to something else, it will just go back to the Personalization screen and no changes will be made, as we set the value to only be the background we specified. If the user tries to make a change to the Screen Saver, the hyperlink will be grayed out and inactive, and the message "One or more of the settings on this page has been disabled by the system administrator" will be displayed at the bottom of the section.

    Conclusion

    If you're tired of users changing the Wallpaper and Screen Saver, and want another way to help avoid Malware, locking down these settings can help a lot. Again, before making any changes to the Registry, make sure to back it up.
    These settings should work in Vista and XP as well.
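    If you would rather script the Registry method than click through regedit, the same two values described above can be set from an elevated Command Prompt (a sketch; the wallpaper path below is only an example, so point it at whichever image you want to enforce):

        reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\System" /v Wallpaper /t REG_SZ /d "C:\Users\Public\Pictures\wallpaper.jpg" /f
        reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\System" /v NoDispScrSavPage /t REG_DWORD /d 1 /f

    Log off and back on (or restart) afterwards, just as with the manual edit.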

    Read the article

< Previous Page | 516 517 518 519 520 521 522 523 524 525 526 527  | Next Page >