Wednesday, October 28, 2009

MOSS 2007 Never Stops Crawling

I came across an issue where our MOSS test farm never completed its initial crawl.  Whenever a search was run from a MySite, the dreaded 'No results matching your search were found.' message appeared.

My initial internet search pointed me toward this well-written troubleshooting post about indexes and maintenance plans conflicting with each other and locking up the indexes: http://blogs.vertigo.com/personal/michael/Blog/archive/2007/01/23/sharepoint-2007-search-never-stops-crawling.aspx

Our setup had already been configured correctly with the duplicate index allowance.  Our solution, as it turned out, was much closer to home.

After battling to get access to the Shared Services administration, I noted that the number of items crawled was at zero.  Since our setup was still pretty much virgin, with two site collections and a minimal amount of unique data entered, I figured the indexing should happen fairly quickly…

After a few hours it was apparent something was not right.  In the Configure Search Settings page, within the Shared Services administration, the full crawl was still in progress with no items yet in the index.  To resolve this, I clicked into the content sources and crawl schedules and stopped any currently running crawls.  Then I went back to Configure Search Settings and chose the 'Reset all crawled content' option.  Had this been a mature system, I would have used that option with a bit more caution.

Once the 'Reset all crawled content' command completed, I initiated a new Full Crawl on the Content Sources and crawl schedules page.  Within a few minutes, the crawl had completed with a few thousand indexed items.  Even better, my searches were finally returning items!

Monday, October 26, 2009

The other day I was faced with the task of automating the transfer of some IIS logs via a scheduled FTP task.
In order to do this, I relied on Rob van der Woude (www.robvanderwoude.com) and the following script to get me going:
http://www.robvanderwoude.com/batexamples_y.php
Before the subroutines kick in, I changed a couple of settings to match the format the IIS logging was set up for. I'll use W3C in this case (exyymmdd.log), where 'ex' and '.log' remain static.
The following portions were added or changed:
:: Add leading zero to YesterD if necessary
IF %YesterD% LSS 10 SET YesterD=0%YesterD%
:: Yesterday's Year in YY format
set YesterY=%YesterY:~-2%
:: Yesterday's date in YYMMDD format
SET SortYest=%YesterY%%YesterM%%YesterD%
REM SET SortYest=%Yesterd%%YesterM%%YesterY%



:: Display the results
REM ECHO Format: DDMMYY (%LocalFormat%)
REM ECHO.==================================
REM ECHO Today: %SortDate% (%Today%)
CALL ECHO %SortYest%
:: Done
ENDLOCAL
GOTO:EOF
This was complemented by a few tweaks to the bat file, and by this post: http://www.marijn.org/archive/2006/batch-files-variables
In particular, the step used is:
---------------------------
@FOR /F "tokens=*" %%i IN ('whateverYourFileIsCalledMaybeYesterday.bat') DO set yesterday=%%i
Echo %yesterday%
----------------------------
That echo is only necessary for testing. Once you are happy with how the script functions, remove the echo line.
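As a cross-check, yesterday's date in the same YYMMDD form takes a few lines of Python; this is only a sketch for comparison, not part of the original batch solution:

```python
from datetime import date, timedelta

def yesterday_yymmdd(today=None):
    """Return yesterday's date formatted YYMMDD, as the batch script builds it."""
    today = today or date.today()
    return (today - timedelta(days=1)).strftime("%y%m%d")

# Month and year rollovers come for free, no leading-zero fixups needed:
print(yesterday_yymmdd(date(2009, 3, 1)))  # prints 090228
```

Unlike the batch approach, there is no need to special-case the leading zero or the two-digit year; strftime handles both.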
This was half the battle. I now had this wonderful script that gives me yesterday's date. Next I needed to feed that date into the FTP script. Hrmmmm… I scratched my head at this for a bit, and got quite frustrated that FTP doesn't directly support batch variables. Alas, the work was not in vain. Below is the resulting content of RunThisTask.bat, which gets called by Task Scheduler each day:
--------------------------------------
@ECHO OFF
@FOR /F "tokens=*" %%i IN ('yest.bat') DO set yesterday=ex%%i.log
REM The above command takes the resulting variable created from yest.bat and puts it in the
REM format of the W3C IIS log naming convention.


echo %yesterday%
> temp.ftp ECHO open ftp.myFTPsiteToSaveLogsTo.com
>>temp.ftp ECHO username
>>temp.ftp ECHO password
>>temp.ftp ECHO lcd T:\YourIISWebLogDirectoryForOneWebsite
>>temp.ftp ECHO put %yesterday%
>>temp.ftp ECHO quit
REM The above creates a file temp.ftp in C:\ftp2govmetric (or where the batch file is called)
REM The following ftp logon, directory change, and put command transfer yesterday's log file
REM to the govmetric ftp site
ftp -s:temp.ftp
del temp.ftp
REM The above calls the created temp file that FTP reads in (FTP doesn't do variables directly)
REM After the FTP session completes, the temp file is then deleted, and the transfer scheduled task is complete
-------------------------------------
The first line after @ECHO OFF is where the static parts of the filename ('ex' and '.log') get appended to the calculated date. Of course, if the system time ever goes out of whack, this script will fail miserably.
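The same write-a-temp-script workaround can be sketched in Python, purely as an illustration of the idea; the host, credentials, and filename below are placeholders, not real values:

```python
def build_ftp_script(host, user, password, local_dir, filename):
    """Build the command list that the batch file writes to temp.ftp:
    open the host, log in, change the local directory, put the file, quit."""
    return [
        f"open {host}",
        user,
        password,
        f"lcd {local_dir}",
        f"put {filename}",
        "quit",
    ]

# Placeholder values standing in for the real host and credentials:
commands = build_ftp_script(
    "ftp.example.com", "username", "password",
    r"T:\YourIISWebLogDirectoryForOneWebsite", "ex091026.log",
)
print("\n".join(commands))
```

Written to a file and passed to `ftp -s:`, this produces exactly the kind of scripted session the batch file generates.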
Security - There is none! This solution uses only the technology built in to Windows 2003. If the receiving server can handle SFTP, then use that! Note that the password you use with FTP is sent in plain text. You can lock down the directory where these scripts are stored with NTFS permissions, which should suffice for this purpose, since you need a user with admin rights to kick off the scheduled task as well (so either the Administrators group or the user running the task is who to whittle permissions down to).

Wednesday, October 15, 2008

Team Foundation Server 2005 Workgroup

What an absolute beast of a child!

I now have the displeasure of installing this development environment. The nice thing about the installation is the installation guide. The steps involved are, in my opinion, second-rate for a multi-billion-dollar company.

The requirements are laid out fairly well. You need:

Server 2003 (check)
IIS 6 with ASP.NET (check)
SharePoint Services (doc didn't work for me here)
SQL 2005 (no SP worked for me, applying it afterwards)

The main caveat for anyone attempting to install is to follow the guide and use the different accounts for their designated roles, or you will run into problems while installing.

One feature I do like is how the pre-installation checks take place. They need to go much further, though. For example, you need IIS to be freshly installed, meaning no Virtual Server or any other website installed. Virtual Server 2005 R2 is at least kind enough in its installation steps to warn you of the potential issues that installing on anything other than the default website will cause. Quite simply, the Team Foundation web tier (unless you are an absolute whiz on IIS 6) demands sole existence.

The other issue I had was with the SharePoint installation. While it's easy enough to appreciate that you must manually install SharePoint so as not to use MSDE, make sure you don't make any configuration changes on the default admin website, or again you will have to start from scratch. The same goes for Reporting Services: don't touch it, no matter how much you think you may need to for permissions or whatever. The Team Foundation installation will handle all of it. What I found perplexing is that while the default installation of SharePoint Services 2.0 SP2 worked fine, when I tried to install SharePoint Services 3.0 (and yes folks, 3.0 is listed in the installation manual), the Team Foundation installation demanded SharePoint Services 2.0 with SP2! This was the only discrepancy I found in an otherwise well-written installation document.

Once the installation completes successfully, you are not given much direction as to what to do next. To actually use this, one must have the Team Explorer application installed into an existing Visual Studio 2005 setup. This is done by running the Team Foundation Server setup program and selecting that option.

After the installation completes, the first step is to add users/groups to the foundation server, using the account you installed with, on the foundation server itself. Open Visual Studio, then click the newly created Team menu item. From there, navigate to the Team Foundation Server configuration, then go to Groups. The interface here is pretty self-explanatory: you add users/groups to the default security groups for permissions, or create your own. All should then work, right? Wrong. For users to use Team Foundation Workgroup (the standard edition requires purchase of its own license), each user must be individually added to the Licenses group. The original installation user will already appear here (that's who you're logged in as to get to this point). Once you add at least one other administrator, you can log out, log in as the new administrator, and then remove the installation account to make use of all five licensed connections.

Once the installation finally goes through, everything runs fairly smoothly. In conclusion, I found the installation to have some very positive areas, and some areas that could build upon the Virtual Server installation scenario. It was a bit like having a family member refuse to sit and eat at the dinner table because the plates/forks/etc. were not set to their liking.

Tuesday, October 14, 2008

Virtual PC Time Synchronization

Today I received a question about how to disable a VM's time synch with its host. The answer is not as straightforward as it is with VMware.

http://blogs.msdn.com/virtual_pc_guy/archive/2007/11/28/disabling-time-synchronization-under-virtual-pc-2007.aspx

In short, adjust the VM's .vmc XML file to include tags along the following lines (placing them under the mouse element is fine):

<components>
    <host_time_sync>
        <enabled type="boolean">false</enabled>
    </host_time_sync>
</components>

At least it is in XML

Monday, September 22, 2008

Remotely Logging Off Terminal Service Users

This guy has a concise post on the matter:

http://www.danrigsby.com/blog/index.php/2008/08/26/remotely-log-off-remote-desktop-users/

Tuesday, September 16, 2008

Ready, IPSEC, No Communication

That was the problem I had recently experienced with the application of updates MS09-045 and MS08-049 in tandem. The machine this took place on also doubles as an EndPoint Security Server (which invites its own slew of issues).

The handiwork of Ant Drewery at http://www.drewery.net/blog/2006/04/18/failure-of-ipsec-services/#comment-40309 helped send me in the right direction. Looks like Petri may have some competition!

The issue occurred when trying to restart the machine after installing the updates. For whatever reason, the system just did not want to bounce, despite about three forced restart commands. When it came back up, there was no network connectivity to the outside world. IPSEC had ceased all that communication quite nicely.

To resolve this, I re-registered a DLL with the following command:

regsvr32 polestor.dll

Upon restart, all works nicely! This was on a 2003 SP2 client with a 2003 SP2 Domain Controller.

Tuesday, August 26, 2008

More IIS 6.0 Fun

Several months ago, I set up a new network environment that was not live at the time.  That environment, once up and running, was used by some outside consultants and developers as an ad-hoc testbed.  Ultimately, the work carried out was not entirely removed or cleaned up afterwards (I am starting to doubt this ever happens), so the system was not back in its original state.  Luckily (or smartly), a complete backup of the system state, the IIS metabase, and the IIS configuration and directory structures was made before access was given.  This left me two options:
 
1) Figure out what got changed and how to undo those changes
2) Restore the system settings, configuration, metabase, and directory structure.
 
Time not being of the essence, I chose the former.
 
When a website uses an anonymous user, that user can either be the default (IUSR_machinename) account or any other user you specify, either domain or local.  When that anonymous user is non-default on any site or virtual site in IIS, it can pose frustrating authentication problems.  There are two ways (maybe three, but I didn't have any luck with the adsutil.vbs approach of synchronizing the passwords) that I was able to resynch the ever-changing anonymous user password hash.  The first is to root out every discrepancy for each and every incarnation of the mismatched user in the metabase (make sure you are able to edit the metabase directly if you want to try this dicey approach).
 
The more straightforward approach, and the much, much safer one, is to disable all authentication for each website and virtual website.  Once that is done, you can go back and start re-enabling anonymous authentication.  Under the hood, you are letting IIS automagically handle all the password hashes in the metabase by removing them.  Then, as you re-enable anonymous access for each website or virtual website, IIS will automatically keep the passwords synchronized when you select the other sites in the Apply Changes prompt.
 
What causes them to get out of synch!?
 
This, thus far in my experience, is not terribly easy to explain.  From what I can gather in the limited testing I have done, let's say you have siteA and virtualSiteA.  If you disable the synchronization on siteA and later re-enable it, you need to select any and all virtual sites when prompted to apply the authentication changes.  If this doesn't happen, then the oldest password hash (now virtualSiteA's) is the only legitimate hash to authenticate against, and the rest of your sites will not operate the way you want/need them to.
 
So, before you go out and start messing with application pools, or deleting and recreating users, try the above (no warranties!  Use entirely at your own risk!).

SQL Data Types

So now that I'm fully studying for my 70-431 exam, I am writing out the data types:
 
  • Exact Numeric (Used for precision data)
  • Approximate Numeric (Used rarely, and appears to me to be a hangover from C)
  • DateTime
  • Money
  • Character
  • Binary  (this includes the image type, most likely used in SharePoint for storing documents)
  • Specialized (identity for primary keys, GUIDs, are two examples here)
Constraints:
  • Check Constraints (These are set on tables)
  • Rules (can be used outside of a table; these are being phased out, if not already gone in 2008, so do not use them)
  • Unique Constraints (Prevents duplicate values in columns)
  • Default Constraints
  • Primary Key
  • Foreign Key (this constraint ensures the value exists in another table already)
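To make the constraint list concrete, here is a small sketch using Python's sqlite3 module (sqlite standing in for SQL Server, so the syntax differs slightly; the table and column names are invented):

```python
import sqlite3

# Demonstrates primary key, unique, check, default, and foreign key constraints.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # sqlite needs FK enforcement switched on
con.execute("""
    CREATE TABLE dept (
        dept_id INTEGER PRIMARY KEY,
        name    TEXT UNIQUE NOT NULL
    )""")
con.execute("""
    CREATE TABLE employee (
        emp_id  INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        salary  REAL CHECK (salary > 0),
        status  TEXT DEFAULT 'active',
        dept_id INTEGER REFERENCES dept(dept_id)
    )""")
con.execute("INSERT INTO dept (dept_id, name) VALUES (1, 'IT')")
con.execute("INSERT INTO employee (name, salary, dept_id) VALUES ('Bob', 100.0, 1)")
try:
    # Violates the foreign key constraint: dept 99 does not exist.
    con.execute("INSERT INTO employee (name, salary, dept_id) VALUES ('Eve', 50.0, 99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The same ideas carry straight over to SQL Server's CHECK, DEFAULT, UNIQUE, PRIMARY KEY, and FOREIGN KEY clauses.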

 

Friday, August 15, 2008

Scheduled Tasks are a Sensitive thing!

The other day, after restricting service accounts on a network, I was faced with troubleshooting why some scheduled tasks had stopped running.  The accounts that ran the tasks were no longer part of the Administrators group on the local machine in an Active Directory environment.  Not even the Power Users or Server Operators group will work for you here, and we are referring to Server 2003 and later.
 
I believe the best practice is to create a domain user and then add that account to each local machine's local Administrators group as necessary to perform its tasks.

Thursday, August 14, 2008

Network Monitor and the SSL Handshake with DNS

Recently I found myself migrating a system to a new data centre. During that testing a user realized that their process for submitting data was not working.

Despite the website working, the submission to an ASP link over HTTPS kept returning errors that the specified file cannot be found.

After quite a bit of time, the root cause was traced to DNS. Early on, I missed a fundamental question: whether or not they used the hosts file on the system they were submitting information with. As it turns out, there was a typo in the domain name entered into the hosts file!

What ultimately led us back to that (we had previously been reassured this was not the problem and the link was fine) was using Network Monitor to watch the SSL handshake.

The following link provided a nice rundown showing the mechanisms on a Windows Server:
http://support.microsoft.com/kb/257587

It was in the packet details that I realized not only was the site certificate being sent, but the parent Certificate Authority's certificate was being sent as well. The sending of the parent CA only happens when the initial website SSL certificate is not trusted by the connecting client. This can happen for any number of reasons: out-of-date revocation lists, CA updates not applied, or, in my experience here, a typo in the hosts file.

So, wait a minute, why didn't it just use DNS? Well, the DNS specification (RFCs 1034 and 1035) has a DNS client follow a very specific method of resolving a name to an IP address:

1) Client checks the local hosts file
2) Client checks its local memory cache
3) Client queries a DNS server (a second one if no response)
(the client may also try to query a root server here, depending on configuration and the definition of a DNS client)
4) Client waits for a response
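The steps above can be sketched as a toy Python model of the client-side lookup order; every name and address here is invented for illustration:

```python
# The hosts file wins before the cache, and the cache before a real DNS query.
HOSTS_FILE = {"secure.example.com": "10.0.0.5"}   # a typo here is what broke SSL trust
DNS_CACHE = {"secure.example.com": "203.0.113.7"}

def query_dns_server(name):
    # Stand-in for steps 3 and 4: ask a configured DNS server and wait.
    raise LookupError(f"no record for {name}")

def resolve(name):
    if name in HOSTS_FILE:         # 1) hosts file is consulted first
        return HOSTS_FILE[name]
    if name in DNS_CACHE:          # 2) then the local resolver cache
        return DNS_CACHE[name]
    return query_dns_server(name)  # 3-4) finally a real DNS query

print(resolve("secure.example.com"))  # prints 10.0.0.5: the hosts entry wins
```

Which is exactly why a single typo'd hosts entry silently overrides a perfectly healthy DNS zone.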

So why does the hosts file get checked first and not the local cache? This is because of the history of DNS. Back before hierarchical querying was established, ALL of the internet's DNS was held in a hosts file. Once this became unmanageable, DNS in its current incarnation (more or less) was formed.

Friday, January 04, 2008

WSUS 3.0 and .NET 2.0 SP1

Today after installing patches to the WSUS server, it stopped working after the restart.

What I should have done was take a nice backup of the system state before installing the patches!

In any case, the machine came up after the restart, but the following types of errors appeared:

I had to set the Web Publishing service to NOT allow desktop interaction

ASP.NET 2.0.50727.0 EventId 1314 appeared in the event log with the following error:

Event code: 4011
Event message: An unhandled access exception has occurred.
..... etc.

I got to the point where I had no authentication mechanism, even though all services were running. No users in the WSUS local groups... strange...

So, rather than removing the patches, I uninstalled and reinstalled WSUS (leaving all the information intact: the database, the logs, and one other option I can't remember). The problem with this is that I would still most likely have the same underlying issue and need to manually uninstall the updates applied before the installation.

Upon second attempt at reinstallation... it worked.

Friday, December 28, 2007

Migrating over from a 2000 Domain Controller to a 2003 Domain Controller

http://technet2.microsoft.com/windowsserver/en/library/99f53498-ce25-4ab4-b476-7aa6e1997d641033.mspx?mfr=true

http://support.microsoft.com/kb/555549

Tuesday, October 16, 2007

Control your data on a 2003 file server

I came across this article at TechRepublic and am impressed.  I will be implementing this shortly on a network and will follow up with my findings.
 
 
The ability to restrict not just MP3s, but perhaps make it so only certain types of file can be saved in certain locations, is what I am really after.

Run Internet Explorer on *nix

Hmm, this could be an interesting way to mitigate MS saturation into a network.  The downside is that legal use of it still requires an MS OS license.
 

Monday, October 15, 2007

IIS 6.0 Nuances

Today while rearranging some IIS websites I came across a particularly frustrating challenge.
 
The work carried out involved adding an additional IP (and NIC) to our web server.  Host headers were not an option because the sites use SSL.  After applying all the necessary NIC and firewall settings, the pre-existing website, when assigned to this new IP address, would not start.  It came back with a message stating that there was a potential conflict using the same ports.
 
Interestingly, I was able to create a new site, with the new IP assigned via the setup wizard, which would start.  So what seems to have happened is that the metabase did not get updated appropriately for the pre-existing site.  Rather than use a tool such as httpcfg in this situation, it was far quicker to simply create the new site.  If there were many websites, or they were part of a cluster, then using the httpcfg Windows support tool would have been almost essential.

Thursday, October 11, 2007

Firefox and NTLM Authentication

A network I support has a number of Firefox users.  Recently, after strengthening their GP settings to enhance security, Firefox users started to experience issues logging in to IIS servers that use integrated authentication.  This is because, by default, Firefox uses Digest authentication.  In order to enable NTLM authentication in Firefox, I used the information in this link:  http://www.cauldwell.net/patrick/blog/PermaLink,guid,c7f1e799-c4ae-4758-9de7-5c3e7a16f3da.aspx
 
Basically, the network.automatic-ntlm-auth.trusted-uris setting in Firefox needs to list the servers for which NTLM authentication will be required.
 
If Proxies are required, this setting will be of use:  network.automatic-ntlm-auth.allow-proxies
 
For those wanting a more original source of information, please see the mozilla knowledge base:  http://kb.mozillazine.org/Network.automatic-ntlm-auth.trusted-uris

Monday, October 08, 2007

The Powershell

One of the things I have found useful about PowerShell is the resiliency it provides over the command prompt.
 
A while ago I was copying some directory structures that had some archaic deny permissions set here and there.  In a scramble (you can tell this environment is organized, can't you?) to copy the folder structure and files, I did something like:
 
xcopy c:\FolderParent\* e:\NewFolder\ /E
 
Unfortunately, at the first access-denied error, the copy process aborts.  Nor does any information get displayed about where the copy stopped.
 
PowerShell allows you to do something like:
 
copy-item c:\FolderParent\ e:\NewFolder\ -recurse
 
This will output any errors in red (by default) but continue copying the rest of the data (which is listed by default).  You can also tack on the -Exclude parameter to leave out files AND folders that match the specified string (like *.txt).
 
Now, if I take the time, I should be able to doctor up a nice master script that will output a complete log and an estimated time to copy x GB!
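For comparison, here is a rough Python sketch of the same keep-going behavior: log each failure but carry on copying. The paths and structure are illustrative, not a copy of any production script:

```python
import os
import shutil

def resilient_copy(src, dst):
    """Copy a directory tree, collecting failures instead of aborting,
    roughly what copy-item -Recurse does versus xcopy stopping dead."""
    errors = []
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            try:
                shutil.copy2(os.path.join(root, name), os.path.join(target, name))
            except OSError as exc:
                # Log and keep going, rather than aborting the whole copy.
                errors.append((os.path.join(root, name), exc))
    return errors
```

The returned error list is the "complete log" piece; an ETA would just need a byte count up front.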

The Glory of a UPS

Since I have last posted I have passed my 70-290 exam.

I am now working on the 70-291 and CCNA 640-801 exams. The 70-291 seems to be mostly about technical knowledge and the use of Network Monitor.

Back when I passed the 70-290, I had registered for the CCNA INTRO exam. I missed a pass by about 2 questions (12 points or something), which was a bit of a bummer. So now I am going to take the full CCNA in the next few weeks, as the exam completely changes in early November 2007 and I REALLY do not want to lose any of the study hours I have put in thus far!

So last week I migrated a box that had the role of file server. This basically meant that most of the company's important stuff was there, and everything else was scattered about wherever there was space. In the throes of a last-minute, rushed migration, I hooked up one of our nice new APC UPSes. Coincidentally, the following week (this Monday morning!) the local power company decided to yo-yo the grid for the entire morning. The bonus? Our UPS worked wonderfully. The drawback? Everybody still experienced downtime at their workstations.

It has been a late night of studying for the CCNA. Now I am off to bed, ready to take on tomorrow's onslaught of machinistic demands :-) Now if I could just learn how to master a PIX in a couple of hours... That would bode well for my network overhaul project!

Thursday, June 07, 2007

Virtualization Options

The below link is a nice starting point for finding out more about virtualization:

 

http://www.petri.co.il/virtual_virtualization_options_compared.htm

FindString Performance

Today I was tasked with finding out whether a particular string exists in a very large amount of data (18 GB) that is not indexed.  The files are preparation files to be loaded into a database.  The real shortcoming I find with findstr compared to grep is that findstr does not let you search select subdirectories without writing a batch file to script the work.  With grep, this can be done straight from the CLI.
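For what it's worth, the selective-subdirectory search is also only a few lines of Python; a sketch, with grep remaining the quicker tool when it is available:

```python
import pathlib

def find_string(needle, base, subdirs):
    """Search only the named subdirectories under base for a literal string,
    i.e. the selective search findstr lacks without a wrapper batch file."""
    hits = []
    for sub in subdirs:
        for path in pathlib.Path(base, sub).rglob("*"):
            if path.is_file() and needle in path.read_text(errors="ignore"):
                hits.append(str(path))
    return hits
```

Called as find_string("some value", r"D:\prep", ["batch1", "batch3"]), it skips every subdirectory you did not name, which is precisely the point.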