Collection of scripts

Recently I started contributing to the community by uploading some of my scripts to the Technet Script Repository. Today I would like to highlight two of the scripts I have uploaded.

Delete files older than x-days – Cleanup Script

This script deletes files older than x-days. The script is built to be used as a scheduled task and automatically generates a log file name based on the folder location and the current date/time. There are various levels of logging available, and the script can also run in -listonly mode, in which it only lists the files it would otherwise delete. There are two main routines: one that deletes the files and a second that checks whether any empty folders are left that can be deleted.
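To give an idea of the approach, the core of the script boils down to something like the following sketch. This is only an illustration, not the actual deleteold.ps1 code; here -WhatIf plays the role that the -listonly switch has in the real script:

# Remove files older than a cutoff date
$FolderPath = 'C:\Docs'
$FileAge    = 7
$Cutoff     = (Get-Date).AddDays(-$FileAge)
Get-ChildItem -Path $FolderPath -Recurse -File |
    Where-Object {$_.LastWriteTime -lt $Cutoff} |
    Remove-Item -WhatIf

# Then remove folders that are left completely empty, deepest folders first
Get-ChildItem -Path $FolderPath -Recurse -Directory |
    Sort-Object -Property FullName -Descending |
    Where-Object {-not (Get-ChildItem -Path $_.FullName -Force)} |
    Remove-Item -WhatIf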

PowerShell function to run as a different user

The script has both a Set and a Get mode. When the Set switch is specified, the script prompts for credentials and writes the password to the file specified. When the script is run with the Get switch, it reads the password from the file specified in the $filename variable and uses the username specified in the $username variable. This allows you to run as another identity without having to enter credentials.
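The general pattern is similar to the sketch below; $filename and $username correspond to the variables mentioned above, while the remaining names and paths are only examples:

# Set: prompt for credentials and store only the password, encrypted as a secure string
# (the stored string can only be decrypted by the same user on the same machine)
$filename = 'C:\Scripts\securestring.txt'
$username = 'CONTOSO\ServiceAccount'
(Get-Credential).Password | ConvertFrom-SecureString | Set-Content -Path $filename

# Get: rebuild the credential from the stored password and run something as that identity
$password   = Get-Content -Path $filename | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)
Start-Process -FilePath powershell.exe -Credential $credential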

For a complete listing of all scripts I have published, please have a look at the Technet Gallery. I have published other scripts there as well, and I will be happy to answer any questions you have about them.


10 thoughts on “Collection of scripts”

  1. tboysen

    Hi, I’m interested in using your ‘Delete files older than x-days’ cleanup script to clean up SQL .bak files older than a certain age while retaining at least y files in the specified directory. I read through the switches in your script but I don’t see a way to do that. Is it possible with your script?

    I don’t want a situation where the backup has been failing for longer than x-days and the script accidentally deletes all of the backups.

    1. Jaap Brasser Post author

      Currently this is not an option, because the script is intended to clean up large folder structures. What you could do is create a simple PowerShell script that performs the check for you and then calls this script. For example, a script with the following lines could perform the verification and then execute deleteold.ps1:

      if (Get-ChildItem -Path C:\Docs | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-7)}) {
          .\deleteold.ps1 -FolderPath C:\Docs -FileAge 7 -LogFile H:\log.log
      }

      I might implement this in a future version of the script, but for now you can use this workaround.

  2. Paul

    Hi Jaap,

    First, thank you for the script. It does clean up. I implemented it in DEV, TEST and STAGE and they all seemed OK, running against all the backup folders of those servers. When I went to PROD it also works, but there is one quirk: whilst it is busy cleaning, I open the log file with TextPad. I see the header and the first couple of files it deleted. If you click somewhere in TextPad it asks you if you want to reload the file, because the file contents have changed. If you click Yes, it reloads the file, but now you are only left with the header and footer; the files that were deleted are no longer listed there.

    Any ideas how this can be resolved? I need that info to keep the auditors happy that we have a log.

    Kind thanks,
    Paul.

    1. Jaap Brasser Post author

      Does this behavior only occur when you are using TextPad? Alternatively, you might use the Get-Content cmdlet for this purpose:
      Get-Content -Path C:\logfile.log -Tail 10
      This shows you the last ten lines of the log, i.e. the files the script most recently worked on.

  3. Henk Hildering

    Regarding “Delete files older than x-days – Cleanup Script” (deleteold.ps1 Version: 1.8.5)
    on line 773 it states
    if ($Auto:og) {$Switches+="`r`n`t`t-AutoLog"}
    I assume it must say $AutoLog instead of $Auto:og (the : is close to the l key).
    Nothing goes wrong, as it is only used for displaying the switches that were used, but I thought I’d just let you know.

    1. Jaap Brasser Post author

      Thanks Henk, there was indeed a typo in there. I have a version 1.9 that I am currently testing, and your correction has been added to it as well. Thanks for taking the time to point that out to me!

  4. herbiek

    Hi Jaap,

    Thanks for your great script, which I started using a couple of days ago to keep our company temporary folder as clean as possible (keeping only files and folders in that temporary folder that are less than 1 week old). However, I have an issue with your deleteold.ps1 script, which I’m not sure is due to wrong usage or a bug. I also want to remove the directories that are empty after the file removal. Somehow this does not work as expected: I end up with a directory with more than 1000 empty directories. I also tried the -CleanFolders option (without any parameter) with the same result.

    Not sure, but I have the idea that the following could have something to do with this behaviour: since some colleagues copy rather old files into our temporary folder, I use the -CreateTime option so that these files will not be deleted for 7 days despite their older age. Furthermore, there are situations where complete directory trees (including empty directories) are copied into the temporary folder. If I read the description of the delete-empty-folders function literally, folders are only deleted after all files in them have been deleted. But what if a folder never contained any files, will it never be deleted?

    My deleteold.ps1 is dated 8-8-2013. Do you have some suggestions for me?

  5. Maneet Roy

    Hi Brasser,

    Thank you for the wonderful script, it is working like a charm. I have a request, if it can be done: when I run the script it should also check the number of files before deleting. For example: if I run the script for 60 days, it should go to each folder and return the number of files, and if a folder contains fewer than 2 or 3 files it should not delete the files from that folder.

    Basically it is a location where my logs get saved every month; however, it is quite possible that it sometimes does not save logs for the current month due to XYZ reasons. So when I run the script it should not delete the previous file if there are no newer files. Again, if there is a file for the month of Dec’13 and we do not have a file for the month of Jan’13, then it should not delete the Dec’13 file.

    Can this be achieved? Waiting for your response.

  6. Wayne Evans

    Hi Jaap

    I have been testing your deleteold.ps1 script as a scheduled task on some Windows Server 2008 R2 machines and it is working great!

    Just one question – can I call a parameters file from the script instead of listing all the parameters as arguments in the scheduled task?

    The product I support has log files etc. all over the C: and D: drives that I would like to manage. I could set up separate tasks for different “groups” of files to clean up, but it would be easier to have one file with a bunch of include (and exclude) directories…

    One other question – if I use -IncludeFolder, is ONLY that folder (or those folders) scanned, or should I also use -ExcludeFolder?

    (Obviously I am a PowerShell Newbie…)

    Thanks, Wayne

    1. Jaap Brasser Post author

      Hello Wayne,

      The way the script currently works, it first scans all folders specified in -FolderPath and then starts applying the criteria. The advantage of this is that a very wide array of criteria can be used to specify which folders are affected. The disadvantage is that it slows down significantly for large collections of folders, or entire drives for example. So if you have a wide array of folders, I would recommend calling a batch file or PowerShell script that sequentially launches the DeleteOld script against each particular folder. This will improve the speed with which the script is executed and allows you to determine where the log files for a particular folder are stored. You could, for example, decide that the logs of program A and program C are stored in the same log file, while program B and program D each get their own log file.

      About the -IncludeFolder and -ExcludeFolder parameters: the way this works is that a selection is first made with the included folders, and after all included folders are processed the script processes the excluded folders. If you have any specific questions about which parameters to use, feel free to post your current parameters and a short description of what you are trying to achieve, and I will happily give you some pointers.

      So the short answer is: yes, it is possible to run this script against a large folder or drive, but performance will be slow. My advice for fragmented locations is to sequentially launch the script multiple times to shorten the execution time.
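      To make that a bit more concrete, such a wrapper could be as simple as the following sketch; the folder paths, ages and log locations are of course just placeholders:

      # Hypothetical wrapper: launch deleteold.ps1 sequentially, once per folder
      # Program A and C share one log file, program B and D each get their own
      .\deleteold.ps1 -FolderPath 'D:\ProgramA\Logs' -FileAge 30 -LogFile 'D:\CleanupLogs\ProgramAC.log'
      .\deleteold.ps1 -FolderPath 'D:\ProgramC\Logs' -FileAge 30 -LogFile 'D:\CleanupLogs\ProgramAC.log'
      .\deleteold.ps1 -FolderPath 'D:\ProgramB\Logs' -FileAge 60 -LogFile 'D:\CleanupLogs\ProgramB.log'
      .\deleteold.ps1 -FolderPath 'D:\ProgramD\Logs' -FileAge 60 -LogFile 'D:\CleanupLogs\ProgramD.log'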

