When managing AWS EC2 instances, the default approach of using a single keypair for SSH authentication creates significant operational challenges. The fundamental issues include:
- No granular access control (all-or-nothing permission model)
- Security risks when team members leave the project
- Key rotation difficulties requiring full team coordination
- No audit trail for individual user actions
The most effective method is managing multiple public keys in the ~/.ssh/authorized_keys file. Here's the technical implementation:
# On the EC2 instance:
mkdir -p ~/.ssh
chmod 700 ~/.ssh
touch ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
1. Collecting Team Members' Public Keys
Each team member should generate their own keypair (the -C comment makes the key identifiable later):
ssh-keygen -t rsa -b 4096 -f ~/.ssh/dev_team_member1 -C "member1@team"
2. Appending Keys to Authorized Keys
Append each public key to the authorized_keys file with appropriate comments:
# User: alice@team (Expires: 2024-12-31)
ssh-rsa AAAAB3NzaC... alice_public_key
# User: bob@team (Expires: 2024-06-30)
ssh-rsa AAAAB3NzaC... bob_public_key
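The appending step can be scripted so the ownership comment and the key always travel together; a minimal sketch (the `append_key` name and the collected-`.pub`-files layout are assumptions):

```shell
# append_key: add one collected public key to an authorized_keys file,
# preceded by an ownership comment for later audits.
append_key() {
  keyfile=$1                                # e.g. keys/alice.pub
  auth=$2                                   # e.g. ~/.ssh/authorized_keys
  owner=$(basename "$keyfile" .pub)
  printf '# User: %s (added: %s)\n' "$owner" "$(date +%F)" >> "$auth"
  cat "$keyfile" >> "$auth"
  chmod 600 "$auth"                         # keep the strict permissions set earlier
}
```

Usage: `append_key keys/alice.pub ~/.ssh/authorized_keys`.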
Command Restrictions
Limit what specific keys can execute:
command="/usr/bin/monitoring-script",no-port-forwarding,no-X11-forwarding ssh-rsa AAAAB3NzaC... restricted_key
IAM Integration (for AWS environments)
For EC2 instances with IAM roles, consider using AWS Systems Manager Session Manager as an alternative:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"ssm:StartSession"
],
"Resource": "arn:aws:ec2:region:account-id:instance/instance-id"
}
]
}
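With a policy like this attached to an IAM user or role, team members can open a shell through Session Manager with no SSH key on the instance at all (the instance ID is a placeholder; the AWS CLI and Session Manager plugin must be installed):

```shell
# Open an interactive shell via Systems Manager instead of SSH
aws ssm start-session --target i-1234567890abcdef0
```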
- Record key expiration dates in comments (comments are informational only; OpenSSH 7.7+ can enforce expiry with the expiry-time="YYYYMMDD" authorized_keys option)
- Use configuration management tools to automate key updates
- Consider temporary credentials for contractors
- Regularly audit active keys (quarterly recommended)
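For the quarterly audit, ssh-keygen can print each entry's type, bit length, fingerprint, and comment, which makes stale or unknown keys easy to spot; a small wrapper (the `audit_keys` name is an assumption):

```shell
# audit_keys: list fingerprint and comment for every key in an
# authorized_keys file (defaults to the current user's file).
audit_keys() {
  ssh-keygen -lf "${1:-$HOME/.ssh/authorized_keys}"
}
```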
When SSH access fails after adding new keys:
# Check permissions:
ls -la ~/.ssh/
# Verify SSH daemon configuration:
sudo grep AuthorizedKeysFile /etc/ssh/sshd_config
# Test authentication with verbose output:
ssh -i ~/.ssh/private_key -v user@ec2-instance
When working with AWS EC2 instances in team environments, the single-key-per-instance approach creates significant operational headaches. The fundamental issues include:
- No granular access control - all users share identical permissions
- No audit trail for individual user actions
- Key rotation requires distributing new keys to everyone
- No ability to revoke individual access without affecting others
The proper solution involves managing multiple public keys in the ~/.ssh/authorized_keys file on your EC2 instance. Here's how it works:
# Sample authorized_keys file format
ssh-rsa AAAAB3NzaC... user1@workstation
ssh-rsa AAAAB3NzaC... user2@laptop
ssh-ed25519 AAAAC3N... user3@desktop
1. Generate individual key pairs:
# For each team member
ssh-keygen -t rsa -b 4096 -f ~/.ssh/teamuser1_ec2 -C "teamuser1@company.com"
2. Distribute public keys to instance:
# Using AWS Systems Manager (AWS-RunShellScript runs as root, so target
# the login user's file explicitly; /home/ec2-user is an example path)
aws ssm send-command \
--instance-ids i-1234567890abcdef0 \
--document-name "AWS-RunShellScript" \
--parameters 'commands=["echo \"ssh-rsa AAAAB3NzaC... user1@workstation\" >> /home/ec2-user/.ssh/authorized_keys"]'
For larger teams, consider these professional approaches:
- IAM-based SSH access: Use EC2 Instance Connect with IAM policies
- SSH Certificate Authority: Centralized key signing with short-lived certificates
- Configuration management: Ansible/Chef/Puppet to manage authorized_keys
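The SSH Certificate Authority approach can be sketched with plain ssh-keygen: the CA signs each user's public key into a certificate that expires on its own. A hedged sketch (the key paths, 1-day validity, and `sign_user_key` name are assumptions; sshd must have TrustedUserCAKeys pointing at the CA's public key for the certificates to be accepted):

```shell
# sign_user_key: issue a short-lived certificate for a user's public key.
# Produces <name>-cert.pub next to the input key.
sign_user_key() {
  ca_key=$1      # CA private key
  pub_key=$2     # user's public key to sign
  identity=$3    # certificate identity, logged by sshd
  principal=$4   # login name the certificate is valid for
  ssh-keygen -s "$ca_key" -I "$identity" -n "$principal" -V +1d "$pub_key"
}
```

Because the certificate expires after a day, revocation largely takes care of itself: the user simply stops receiving new certificates.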
Always follow these guidelines:
# Set proper permissions
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
# Regular key rotation: schedule a monthly run via cron
# (crontab entry; the script path is illustrative)
0 0 1 * * /usr/bin/rotate_ssh_keys.sh
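A rotation script can lean on the expiry comments shown earlier; a hedged sketch (`prune_expired_keys` is a hypothetical helper that assumes the "# User: ... (Expires: YYYY-MM-DD)" comment format, not a standard tool):

```shell
# prune_expired_keys: remove key entries whose preceding comment carries
# an Expires: date in the past. ISO dates compare correctly as strings.
prune_expired_keys() {
  auth=$1
  today=$(date +%F)
  awk -v today="$today" '
    /^# User:.*Expires:/ {
      match($0, /Expires: [0-9-]+/)
      edate = substr($0, RSTART + 9, 10)
      if (edate < today) { skip = 1; next }   # drop the expired comment line
    }
    /^ssh-/ { if (skip) { skip = 0; next } }  # drop the key that followed it
    { print }
  ' "$auth" > "$auth.tmp" && mv "$auth.tmp" "$auth"
}
```

Usage: `prune_expired_keys ~/.ssh/authorized_keys` (e.g. from the cron job above).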
If SSH access fails after adding keys:
- Verify key file permissions (600 for private keys)
- Check sshd_config for the AuthorizedKeysFile setting
- Confirm SELinux/AppArmor isn't blocking access
- Examine /var/log/secure (or /var/log/auth.log on Debian/Ubuntu) for authentication errors