Apr 17 07:48:54.462440 ip-10-0-133-228 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:48:54.462450 ip-10-0-133-228 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:48:54.462457 ip-10-0-133-228 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:48:54.462701 ip-10-0-133-228 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:49:04.520780 ip-10-0-133-228 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:49:04.520796 ip-10-0-133-228 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1b3d19f2c8ee4e25874737b1cd6143a6 --
Apr 17 07:51:27.393603 ip-10-0-133-228 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:27.823112 ip-10-0-133-228 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:27.823112 ip-10-0-133-228 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:27.823112 ip-10-0-133-228 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:27.823112 ip-10-0-133-228 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:27.823112 ip-10-0-133-228 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:27.824737 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.824632 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:27.827757 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827737 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:27.827757 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827758 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827762 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827767 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827772 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827775 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827784 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827788 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827791 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827794 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827797 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827800 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827803 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827806 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827809 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827812 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827814 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827817 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827819 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827822 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:27.827825 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827824 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827827 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827830 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827833 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827836 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827839 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827841 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827844 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827847 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827850 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827853 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827855 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827858 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827860 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827863 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827866 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827869 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827871 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827874 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827877 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:27.828292 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827879 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827882 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827885 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827888 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827890 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827893 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827895 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827898 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827900 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827902 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827905 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827907 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827910 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827912 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827916 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827918 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827921 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827923 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827926 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827929 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:27.828818 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827931 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827934 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827936 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827939 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827941 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827944 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827947 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827950 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827952 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827955 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827957 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827960 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827962 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827965 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827968 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827973 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827976 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827978 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827982 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827984 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:27.829327 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827986 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827989 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827991 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827994 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827996 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.827999 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828437 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828442 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828445 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828448 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828451 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828454 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828456 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828459 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828461 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828464 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828467 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828470 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828472 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:27.829839 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828475 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828478 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828481 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828485 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828488 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828493 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828496 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828500 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828502 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828505 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828508 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828511 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828514 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828517 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828521 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828523 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828526 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828529 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828532 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:27.830314 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828536 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828539 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828542 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828545 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828548 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828550 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828553 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828555 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828558 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828561 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828563 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828566 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828568 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828571 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828573 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828576 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828579 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828581 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828584 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828587 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828590 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:27.830782 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828593 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828595 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828597 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828600 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828602 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828605 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828607 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828609 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828612 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828615 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828617 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828622 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828624 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828627 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828629 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828632 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828635 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828637 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828640 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828643 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:27.831296 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828645 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828648 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828651 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828654 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828666 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828669 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828671 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828674 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828677 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828679 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828682 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828686 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.828689 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829474 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829483 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829492 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829496 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829500 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829504 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829509 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829514 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:27.831769 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829517 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829521 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829525 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829528 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829531 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829535 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829538 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829541 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829544 2565 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829547 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829550 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829558 2565 flags.go:64] FLAG: --cluster-domain="" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829561 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829564 2565 flags.go:64] FLAG: --config-dir="" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829567 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829570 2565 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829574 2565 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829577 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829580 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829584 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829587 2565 flags.go:64] FLAG: --contention-profiling="false" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829590 2565 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829593 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829602 2565 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 07:51:27.832382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829605 2565 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 07:51:27.832382 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:27.829610 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829613 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829616 2565 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829619 2565 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829623 2565 flags.go:64] FLAG: --enable-server="true" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829626 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829633 2565 flags.go:64] FLAG: --event-burst="100" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829636 2565 flags.go:64] FLAG: --event-qps="50" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829639 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829642 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829645 2565 flags.go:64] FLAG: --eviction-hard="" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829649 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829652 2565 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829655 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829658 2565 flags.go:64] FLAG: 
--eviction-soft="" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829661 2565 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829664 2565 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829667 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829671 2565 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829674 2565 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829677 2565 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829679 2565 flags.go:64] FLAG: --feature-gates="" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829684 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829687 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829690 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:27.833001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829694 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829697 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829700 2565 flags.go:64] FLAG: --help="false" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829703 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-133-228.ec2.internal" Apr 17 07:51:27.833632 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:27.829707 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829710 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829713 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829717 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829721 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829724 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829727 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829730 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829733 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829736 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829739 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829742 2565 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829745 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829748 2565 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829752 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829755 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829758 2565 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829761 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829764 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829767 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:27.833632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829773 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829776 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829778 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829781 2565 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829784 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829788 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829790 2565 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829793 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:27.834231 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:27.829798 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829801 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829805 2565 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829808 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829811 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829814 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829817 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829826 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829829 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829832 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829840 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829844 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829847 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829853 2565 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829856 2565 
flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:27.834231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829862 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829865 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829869 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829872 2565 flags.go:64] FLAG: --port="10250" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829875 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829878 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-046d3c1c72d9536a6" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829881 2565 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829884 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829887 2565 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829890 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829893 2565 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829896 2565 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829899 2565 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829902 2565 flags.go:64] FLAG: --reserved-cpus="" 
Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829905 2565 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829908 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829912 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829915 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829917 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829920 2565 flags.go:64] FLAG: --runonce="false" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829923 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829926 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829929 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829932 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829936 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829939 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:27.834843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829942 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829946 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 
07:51:27.829948 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829951 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829954 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829958 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829961 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829964 2565 flags.go:64] FLAG: --system-cgroups="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829967 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829972 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829975 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829979 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829985 2565 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829988 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829991 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829994 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.829997 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 
07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.830001 2565 flags.go:64] FLAG: --v="2" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.830005 2565 flags.go:64] FLAG: --version="false" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.830009 2565 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.830014 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.830017 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830126 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830131 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:51:27.835477 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830134 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830138 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830141 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830144 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830147 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830149 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830153 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830156 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830165 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830169 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830172 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830174 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830177 2565 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830181 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830184 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830187 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830189 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830192 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830194 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830197 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:27.836059 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830200 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830203 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830205 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830222 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830225 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:27.836584 ip-10-0-133-228 
kubenswrapper[2565]: W0417 07:51:27.830227 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830231 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830236 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830238 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830241 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830244 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830247 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830250 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830252 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830255 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830258 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830260 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830263 2565 feature_gate.go:328] unrecognized 
feature gate: InsightsOnDemandDataGather Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830267 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830269 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:27.836584 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830272 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830276 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830279 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830281 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830284 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830288 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830290 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830293 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830296 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830298 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830301 2565 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830304 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830306 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830309 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830312 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830315 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830317 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830320 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830324 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830326 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:27.837135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830329 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830332 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830334 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:27.837656 
ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830337 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830339 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830342 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830344 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830347 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830350 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830352 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830355 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830357 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830360 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830363 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830366 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830368 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830371 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830374 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830377 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:27.837656 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830379 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:27.838125 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830382 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:27.838125 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830385 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:27.838125 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830387 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:27.838125 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.830390 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:27.838125 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.831600 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:27.839236 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.839196 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:51:27.839276 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.839244 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:51:27.840793 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.839473 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:27.840793 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840790 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:27.840793 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840796 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:27.840793 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840800 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840808 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840812 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840816 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840821 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840826 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840831 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840834 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840837 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840843 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840848 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840852 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840855 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840864 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840867 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840870 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840978 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840983 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840986 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:27.841008 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840989 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840992 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840996 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.840999 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841001 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841004 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841006 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841009 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841012 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841015 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841018 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841021 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841025 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841027 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841030 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841033 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841035 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841038 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841041 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:27.841517 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841043 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841046 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841048 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841051 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841055 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841057 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841060 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841062 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841065 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841068 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841071 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841073 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841076 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841079 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841081 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841085 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841088 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841091 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841093 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841097 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:27.841984 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841101 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841103 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841106 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841109 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841112 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841115 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841118 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841120 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841123 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841125 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841128 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841132 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841137 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841140 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841144 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841147 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841150 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841155 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841159 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841162 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:27.842504 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841164 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841167 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841170 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841173 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841175 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.841182 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841328 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841335 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841339 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841342 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841346 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841351 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841354 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841357 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841360 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:27.843009 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841363 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841366 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841369 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841371 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841374 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841376 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841380 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841382 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841385 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841387 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841390 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841393 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841396 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841398 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841401 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841407 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841410 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841413 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841416 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841419 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:27.843406 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841421 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841424 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841427 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841429 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841432 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841435 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841438 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841442 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841444 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841447 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841450 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841452 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841455 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841457 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841460 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841463 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841466 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841469 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841472 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841475 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:27.843899 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841477 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841480 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841482 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841485 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841487 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841490 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841492 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841495 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841498 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841500 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841503 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841505 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841508 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841510 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841513 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841515 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841518 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841521 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841524 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841527 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:27.844455 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841531 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841535 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841538 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841540 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841543 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841546 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841548 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841551 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841555 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841558 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841560 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841563 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841565 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841568 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841570 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841573 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:27.844954 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:27.841575 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:27.845366 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.841580 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:27.845366 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.842333 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:51:27.845521 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.845506 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:51:27.846667 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.846654 2565 server.go:1019] "Starting client certificate rotation"
Apr 17 07:51:27.846776 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.846757 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:27.846833 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.846812 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:27.870437 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.870409 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:27.875088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.875050 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:27.892293 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.892268 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:51:27.898186 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.898150 2565 log.go:25] "Validated CRI v1 image API"
Apr 17 07:51:27.899653 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.899630 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:51:27.901377 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.901360 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:27.904430 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.904397 2565 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7d69ffa8-ae8b-4b75-9902-bbe99a6a2248:/dev/nvme0n1p4 ca189e7b-514c-4527-824b-ccb339128b04:/dev/nvme0n1p3]
Apr 17 07:51:27.904536 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.904426 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:51:27.909857 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.909730 2565 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:27.908055052 +0000 UTC m=+0.396936195 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099347 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25fb1672b17e2ec487563edf261308 SystemUUID:ec25fb16-72b1-7e2e-c487-563edf261308 BootID:1b3d19f2-c8ee-4e25-8747-37b1cd6143a6 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2f:c8:bd:ba:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2f:c8:bd:ba:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:12:09:fb:5f:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:51:27.909857 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.909843 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:51:27.910042 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.910028 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 07:51:27.911471 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.911447 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 07:51:27.911720 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.911474 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-228.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:51:27.911812 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.911781 2565 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:51:27.911812 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.911800 2565 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:51:27.911921 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.911895 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:27.912768 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.912744 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:27.913607 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.913591 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:27.913730 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.913720 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:51:27.916082 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.916071 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:51:27.916116 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.916087 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:51:27.916116 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.916100 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 07:51:27.916116 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.916110 2565 kubelet.go:397] "Adding apiserver pod source" Apr 17 07:51:27.916241 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.916120 2565 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 07:51:27.917252 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.917233 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2fcr" Apr 17 07:51:27.917624 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.917575 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:27.917624 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.917595 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:27.920760 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.920743 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:51:27.922278 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.922260 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:51:27.924543 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924528 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924551 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924560 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924567 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924577 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 
07:51:27.924586 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924595 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924604 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:51:27.924618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924614 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:51:27.924879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924624 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:51:27.924879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924638 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:51:27.924879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.924658 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:51:27.925555 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.925544 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:51:27.925608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.925559 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:51:27.927064 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.927029 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-228.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 07:51:27.927147 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.927036 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list 
resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:51:27.927915 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.927893 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2fcr" Apr 17 07:51:27.930232 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.930199 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:51:27.930320 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.930265 2565 server.go:1295] "Started kubelet" Apr 17 07:51:27.930370 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.930345 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:51:27.930834 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.930780 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:51:27.930879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.930860 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:51:27.931252 ip-10-0-133-228 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 07:51:27.932238 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.932074 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:51:27.933182 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.933168 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:51:27.937427 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.937406 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:51:27.937528 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.937434 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:27.938295 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938273 2565 factory.go:55] Registering systemd factory Apr 17 07:51:27.938396 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938304 2565 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:51:27.938396 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938278 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:51:27.938396 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938363 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:51:27.938531 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938281 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:51:27.938587 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938545 2565 factory.go:153] Registering CRI-O factory Apr 17 07:51:27.938587 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938559 2565 factory.go:223] Registration of the crio container factory successfully Apr 17 07:51:27.938587 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938576 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:51:27.938587 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938587 2565 reconciler.go:26] "Reconciler: start to sync 
state" Apr 17 07:51:27.938757 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.938570 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 07:51:27.938757 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938618 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 07:51:27.938757 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938643 2565 factory.go:103] Registering Raw factory Apr 17 07:51:27.938757 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.938654 2565 manager.go:1196] Started watching for new ooms in manager Apr 17 07:51:27.939242 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.939205 2565 manager.go:319] Starting recovery of all containers Apr 17 07:51:27.940697 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.940300 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:27.943723 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.943701 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-228.ec2.internal" not found Apr 17 07:51:27.943840 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.943785 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-228.ec2.internal\" not found" node="ip-10-0-133-228.ec2.internal" Apr 17 07:51:27.951678 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.951658 2565 manager.go:324] Recovery completed Apr 17 07:51:27.953318 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.953295 2565 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch 
/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 07:51:27.956382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.956367 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:27.958847 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.958829 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:27.958921 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.958861 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:27.958921 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.958872 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:27.959368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.959353 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:51:27.959368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.959366 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:51:27.959456 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.959387 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:27.960305 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.960291 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-228.ec2.internal" not found Apr 17 07:51:27.961582 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.961569 2565 policy_none.go:49] "None policy: Start" Apr 17 07:51:27.961627 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.961586 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:51:27.961627 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.961596 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:51:27.999001 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:27.998984 2565 manager.go:341] "Starting Device Plugin manager" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.999034 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.999048 2565 server.go:85] "Starting device plugin registration server" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.999325 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.999337 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.999440 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.999515 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:27.999523 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:27.999997 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 07:51:28.020262 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.000038 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 07:51:28.021894 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.021874 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-228.ec2.internal" not found Apr 17 07:51:28.037169 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.037129 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:51:28.038441 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.038413 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:51:28.038545 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.038448 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:51:28.038545 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.038477 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 07:51:28.038545 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.038486 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:51:28.038663 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.038583 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:51:28.041592 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.041573 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:28.100122 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.100042 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:28.100991 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.100968 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:28.101103 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.101005 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:28.101103 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.101019 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:28.101103 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.101046 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.110236 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.110203 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.110308 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.110241 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-228.ec2.internal\": node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 
07:51:28.121607 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.121587 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 07:51:28.139054 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.139018 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal"] Apr 17 07:51:28.139122 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.139098 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:28.140096 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.140078 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:28.140152 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.140116 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:28.140152 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.140130 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:28.142451 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.142437 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:28.142667 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.142651 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.142724 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.142684 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:28.143296 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.143278 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:28.143296 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.143289 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:28.143432 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.143305 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:28.143432 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.143313 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:28.143432 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.143317 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:28.143432 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.143339 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:28.145478 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.145463 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.145565 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.145488 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:28.146250 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.146230 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:28.146344 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.146268 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:28.146344 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.146291 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:28.160862 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.160835 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-228.ec2.internal\" not found" node="ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.164955 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.164938 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-228.ec2.internal\" not found" node="ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.222432 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.222403 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 07:51:28.323482 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.323452 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 07:51:28.339795 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.339758 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1cba15509dfb2962e7fb3af060b0a393-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal\" (UID: \"1cba15509dfb2962e7fb3af060b0a393\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.339863 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.339800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1cba15509dfb2962e7fb3af060b0a393-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal\" (UID: \"1cba15509dfb2962e7fb3af060b0a393\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.339863 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.339820 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6fa2d3d4e5da99e17b218f9fc59a91d2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-228.ec2.internal\" (UID: \"6fa2d3d4e5da99e17b218f9fc59a91d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal" Apr 17 07:51:28.424249 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.424156 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found" Apr 17 07:51:28.440592 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.440569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1cba15509dfb2962e7fb3af060b0a393-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal\" (UID: \"1cba15509dfb2962e7fb3af060b0a393\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" Apr 17 
07:51:28.440695 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.440604 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1cba15509dfb2962e7fb3af060b0a393-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal\" (UID: \"1cba15509dfb2962e7fb3af060b0a393\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.440695 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.440672 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1cba15509dfb2962e7fb3af060b0a393-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal\" (UID: \"1cba15509dfb2962e7fb3af060b0a393\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.440759 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.440693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6fa2d3d4e5da99e17b218f9fc59a91d2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-228.ec2.internal\" (UID: \"6fa2d3d4e5da99e17b218f9fc59a91d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.440759 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.440721 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6fa2d3d4e5da99e17b218f9fc59a91d2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-228.ec2.internal\" (UID: \"6fa2d3d4e5da99e17b218f9fc59a91d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.440820 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.440750 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1cba15509dfb2962e7fb3af060b0a393-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal\" (UID: \"1cba15509dfb2962e7fb3af060b0a393\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.462708 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.462675 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.467419 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.467273 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:28.525061 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.525022 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:28.625649 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.625623 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:28.726106 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.726079 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:28.826653 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.826620 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:28.846112 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.846085 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 07:51:28.846298 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.846233 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:28.846298 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.846275 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 07:51:28.927743 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:28.927710 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:28.929879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.929841 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:27 +0000 UTC" deadline="2027-11-09 01:14:20.016792929 +0000 UTC"
Apr 17 07:51:28.929879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.929872 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13697h22m51.086923827s"
Apr 17 07:51:28.938053 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.938026 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 07:51:28.955668 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.955642 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:28.973874 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.973834 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9q2b4"
Apr 17 07:51:28.981282 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:28.981239 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9q2b4"
Apr 17 07:51:29.028611 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:29.028576 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:29.064295 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:29.064254 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa2d3d4e5da99e17b218f9fc59a91d2.slice/crio-ce67ce05d0d3111d1c97eab82756bfeb3da734f3ae3fe6eda5eeabb83f40e47a WatchSource:0}: Error finding container ce67ce05d0d3111d1c97eab82756bfeb3da734f3ae3fe6eda5eeabb83f40e47a: Status 404 returned error can't find the container with id ce67ce05d0d3111d1c97eab82756bfeb3da734f3ae3fe6eda5eeabb83f40e47a
Apr 17 07:51:29.064877 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:29.064848 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cba15509dfb2962e7fb3af060b0a393.slice/crio-6ac4c5dd8018cd5dae704a3d6dce609af7fd2cb24453e2b79d53e128aa027e44 WatchSource:0}: Error finding container 6ac4c5dd8018cd5dae704a3d6dce609af7fd2cb24453e2b79d53e128aa027e44: Status 404 returned error can't find the container with id 6ac4c5dd8018cd5dae704a3d6dce609af7fd2cb24453e2b79d53e128aa027e44
Apr 17 07:51:29.069784 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.069766 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:51:29.129282 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:29.129249 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-228.ec2.internal\" not found"
Apr 17 07:51:29.156052 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.156032 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:29.237799 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.237733 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:29.247887 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.247868 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:29.249432 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.249420 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal"
Apr 17 07:51:29.255336 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.255323 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:51:29.304231 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.304186 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:29.652945 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.652868 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:29.816694 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.816661 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:51:29.917172 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.917084 2565 apiserver.go:52] "Watching apiserver"
Apr 17 07:51:29.922135 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.922107 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 07:51:29.923493 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.923466 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9","openshift-dns/node-resolver-wcz5n","openshift-image-registry/node-ca-mtwj6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal","openshift-multus/multus-additional-cni-plugins-9htcw","openshift-multus/multus-bgmzb","openshift-cluster-node-tuning-operator/tuned-rntgv","openshift-multus/network-metrics-daemon-k6mnq","openshift-network-diagnostics/network-check-target-z7tsc","openshift-network-operator/iptables-alerter-8vfz7","openshift-ovn-kubernetes/ovnkube-node-pqdwt","kube-system/konnectivity-agent-p5wkz","kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal"]
Apr 17 07:51:29.928431 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.928403 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:29.930737 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.930696 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wcz5n"
Apr 17 07:51:29.931405 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.931154 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.931405 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.931155 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 07:51:29.931405 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.931259 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w9vhh\""
Apr 17 07:51:29.931656 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.931453 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.932491 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.932471 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.932737 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.932721 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qw9hw\""
Apr 17 07:51:29.932919 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.932883 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mtwj6"
Apr 17 07:51:29.933006 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.932989 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.933770 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.933752 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.935160 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935120 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.935394 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935371 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.935607 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935586 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 07:51:29.935678 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935643 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rf8rr\""
Apr 17 07:51:29.935735 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935587 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.935790 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935764 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 07:51:29.935790 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935775 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 07:51:29.935879 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935828 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.935928 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935876 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 07:51:29.936016 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.935996 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zpq7q\""
Apr 17 07:51:29.936909 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.936889 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.937248 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.937229 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 07:51:29.937853 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.937683 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nzr8b\""
Apr 17 07:51:29.940387 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.940368 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.940507 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.940485 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:29.940704 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:29.940671 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:29.942976 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.942810 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-crfbz\""
Apr 17 07:51:29.942976 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.942919 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.943149 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.943049 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.945332 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.945308 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:29.945434 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:29.945384 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab"
Apr 17 07:51:29.945434 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.945427 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:29.947429 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.947410 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 07:51:29.947517 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.947497 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.947576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.947539 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.947628 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.947618 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wxbs8\""
Apr 17 07:51:29.948076 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948037 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:29.948851 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948827 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rx5k\" (UniqueName: \"kubernetes.io/projected/7e100a52-e772-4d62-a573-6f5b62a4671d-kube-api-access-7rx5k\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.948942 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948867 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:29.948942 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948892 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrcs\" (UniqueName: \"kubernetes.io/projected/63918c32-1f1d-43f2-9243-76c8cb35d556-kube-api-access-jsrcs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:29.948942 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948920 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-cni-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949095 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948943 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-k8s-cni-cncf-io\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949095 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.948984 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-etc-kubernetes\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949095 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949008 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-var-lib-kubelet\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949095 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.949095 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949058 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysctl-d\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949110 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-sys\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949135 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-lib-modules\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949158 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-system-cni-dir\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949182 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-cnibin\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949205 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-socket-dir-parent\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949265 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-hostroot\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949331 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-conf-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949406 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-host\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949458 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbc24\" (UniqueName: \"kubernetes.io/projected/7c70fa88-6394-4033-b304-0ea283b0a7eb-kube-api-access-hbc24\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949498 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-system-cni-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949524 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-cni-bin\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949547 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-kubelet\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bv4\" (UniqueName: \"kubernetes.io/projected/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-kube-api-access-k5bv4\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.949685 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949636 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/753ccc51-1724-4403-97ba-abea000798cc-etc-tuned\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949703 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/753ccc51-1724-4403-97ba-abea000798cc-tmp\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949746 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2e063f6-7071-4323-a26c-9d5f28ce786e-hosts-file\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949770 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkpz\" (UniqueName: \"kubernetes.io/projected/f2e063f6-7071-4323-a26c-9d5f28ce786e-kube-api-access-gjkpz\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949797 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-kubernetes\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949840 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-systemd\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949874 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7c70fa88-6394-4033-b304-0ea283b0a7eb-iptables-alerter-script\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949906 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-host\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949934 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2s9\" (UniqueName: \"kubernetes.io/projected/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-kube-api-access-4d2s9\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6"
Apr 17 07:51:29.949962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949962 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-os-release\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.950367 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.949990 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-cni-binary-copy\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.950367 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.950013 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-os-release\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.950367 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.950044 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-cni-multus\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.950367 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.950067 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-daemon-config\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.950367 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.950092 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2e063f6-7071-4323-a26c-9d5f28ce786e-tmp-dir\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n"
Apr 17 07:51:29.952453 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.951483 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8k57d\""
Apr 17 07:51:29.952453 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.952363 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 07:51:29.952611 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.952589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.952675 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.952627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:29.952847 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.952818 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 07:51:29.953536 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.953132 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:29.953536 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.953378 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 07:51:29.953872 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.953851 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 07:51:29.953970 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.953914 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 07:51:29.954020 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.953972 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-cnibin\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.954020 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954006 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-cni-binary-copy\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.954109 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954041 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-multus-certs\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:29.954642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954159 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-modprobe-d\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.954642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954295 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysconfig\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.954642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954445 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysctl-conf\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:29.954642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954483 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c70fa88-6394-4033-b304-0ea283b0a7eb-host-slash\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:29.954642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954594 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-serviceca\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") "
pod="openshift-image-registry/node-ca-mtwj6" Apr 17 07:51:29.954882 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954657 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-netns\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:29.954882 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954700 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-run\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:29.954882 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954742 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pt49\" (UniqueName: \"kubernetes.io/projected/753ccc51-1724-4403-97ba-abea000798cc-kube-api-access-9pt49\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:29.954882 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.954836 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 07:51:29.955789 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.955682 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9t7p7\"" Apr 17 07:51:29.955789 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.955716 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:51:29.955926 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.955873 2565 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:51:29.982193 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.982160 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:28 +0000 UTC" deadline="2027-09-21 18:06:29.138473342 +0000 UTC" Apr 17 07:51:29.982193 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:29.982189 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12538h14m59.156286576s" Apr 17 07:51:30.040587 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.040558 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:51:30.042630 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.042576 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" event={"ID":"1cba15509dfb2962e7fb3af060b0a393","Type":"ContainerStarted","Data":"6ac4c5dd8018cd5dae704a3d6dce609af7fd2cb24453e2b79d53e128aa027e44"} Apr 17 07:51:30.043716 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.043689 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal" event={"ID":"6fa2d3d4e5da99e17b218f9fc59a91d2","Type":"ContainerStarted","Data":"ce67ce05d0d3111d1c97eab82756bfeb3da734f3ae3fe6eda5eeabb83f40e47a"} Apr 17 07:51:30.055821 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.055791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-systemd\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.055959 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:30.055827 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.055959 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.055852 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-ovn\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.055959 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.055924 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-sys-fs\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.056094 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.055962 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-sys\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.056094 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.055997 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3-agent-certs\") pod \"konnectivity-agent-p5wkz\" (UID: \"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3\") " 
pod="kube-system/konnectivity-agent-p5wkz" Apr 17 07:51:30.056094 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056025 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-socket-dir-parent\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056094 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-sys\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.056094 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-hostroot\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056094 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056076 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-hostroot\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056105 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-conf-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: 
I0417 07:51:30.056126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-socket-dir-parent\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056133 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-host\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056163 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-var-lib-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056168 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-conf-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-cni-netd\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.056360 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:30.056197 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-host\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056234 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkxd8\" (UniqueName: \"kubernetes.io/projected/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-kube-api-access-rkxd8\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056259 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-device-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056284 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-host\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056319 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-host\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6" Apr 17 07:51:30.056360 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:51:30.056334 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2s9\" (UniqueName: \"kubernetes.io/projected/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-kube-api-access-4d2s9\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6" Apr 17 07:51:30.056360 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-cni-binary-copy\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056388 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv4\" (UniqueName: \"kubernetes.io/projected/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-kube-api-access-k5bv4\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056409 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/753ccc51-1724-4403-97ba-abea000798cc-tmp\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2e063f6-7071-4323-a26c-9d5f28ce786e-hosts-file\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n" Apr 17 07:51:30.056973 
ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056464 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-etc-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056630 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovn-node-metrics-cert\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056692 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-os-release\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056729 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-daemon-config\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056814 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2e063f6-7071-4323-a26c-9d5f28ce786e-tmp-dir\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n" Apr 17 07:51:30.056973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.056975 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-cni-binary-copy\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057041 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057072 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-cni-bin\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057098 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057075 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2e063f6-7071-4323-a26c-9d5f28ce786e-hosts-file\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057115 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f2e063f6-7071-4323-a26c-9d5f28ce786e-tmp-dir\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057123 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-cnibin\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057137 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057150 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-cni-binary-copy\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057173 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-modprobe-d\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057196 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-serviceca\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057204 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-cnibin\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057240 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-run-netns\") pod \"ovnkube-node-pqdwt\" (UID: 
\"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057263 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovnkube-script-lib\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pt49\" (UniqueName: \"kubernetes.io/projected/753ccc51-1724-4403-97ba-abea000798cc-kube-api-access-9pt49\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057309 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057333 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrcs\" (UniqueName: \"kubernetes.io/projected/63918c32-1f1d-43f2-9243-76c8cb35d556-kube-api-access-jsrcs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057339 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-modprobe-d\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.057442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057356 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-env-overrides\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057402 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-cni-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057427 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-k8s-cni-cncf-io\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057456 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-etc-kubernetes\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057479 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovnkube-config\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057540 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysctl-d\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057564 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-lib-modules\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057564 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057588 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-system-cni-dir\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057610 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-cnibin\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057633 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-log-socket\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbc24\" (UniqueName: \"kubernetes.io/projected/7c70fa88-6394-4033-b304-0ea283b0a7eb-kube-api-access-hbc24\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-system-cni-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057704 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-cni-bin\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057726 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-kubelet\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057747 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/753ccc51-1724-4403-97ba-abea000798cc-etc-tuned\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.058284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057767 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-etc-kubernetes\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057766 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkpz\" (UniqueName: \"kubernetes.io/projected/f2e063f6-7071-4323-a26c-9d5f28ce786e-kube-api-access-gjkpz\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057794 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-serviceca\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057810 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x46l\" (UniqueName: \"kubernetes.io/projected/00007c9f-f097-4e98-b152-d5c5885a9d69-kube-api-access-5x46l\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057846 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-kubernetes\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057869 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-systemd\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057894 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7c70fa88-6394-4033-b304-0ea283b0a7eb-iptables-alerter-script\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.057903 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057916 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-etc-selinux\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057942 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-os-release\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057962 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-cni-multus\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057978 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysctl-d\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.057988 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.557957454 +0000 UTC m=+3.046838604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058002 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-cni-multus\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058039 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-cnibin\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058054 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-lib-modules\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058083 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.059045 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058086 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-k8s-cni-cncf-io\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058102 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-kubelet\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058108 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-kubernetes\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058119 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-systemd-units\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058134 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-socket-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058150 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-registration-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058160 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-cni-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-multus-certs\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058147 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-systemd\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058187 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysconfig\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058205 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysctl-conf\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-cni-binary-copy\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058246 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c70fa88-6394-4033-b304-0ea283b0a7eb-host-slash\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058267 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-node-log\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3-konnectivity-ca\") pod \"konnectivity-agent-p5wkz\" (UID: \"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3\") " pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058300 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-kubelet-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058328 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-netns\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.059816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058358 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-run\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rx5k\" (UniqueName: \"kubernetes.io/projected/7e100a52-e772-4d62-a573-6f5b62a4671d-kube-api-access-7rx5k\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058417 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-var-lib-kubelet\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058444 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058448 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-multus-certs\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058473 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-system-cni-dir\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058499 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-slash\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058513 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysconfig\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058588 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-run-netns\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058630 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-run\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058646 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7c70fa88-6394-4033-b304-0ea283b0a7eb-iptables-alerter-script\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058696 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-os-release\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-os-release\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058763 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-etc-sysctl-conf\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.057701 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-multus-daemon-config\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-system-cni-dir\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058824 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e100a52-e772-4d62-a573-6f5b62a4671d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.060424 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058826 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c70fa88-6394-4033-b304-0ea283b0a7eb-host-slash\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:30.061143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058839 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/753ccc51-1724-4403-97ba-abea000798cc-var-lib-kubelet\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.061143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-kubelet\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.061143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.058873 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-host-var-lib-cni-bin\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.061143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.060722 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7e100a52-e772-4d62-a573-6f5b62a4671d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.061143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.060962 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/753ccc51-1724-4403-97ba-abea000798cc-etc-tuned\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.064691 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.064668 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/753ccc51-1724-4403-97ba-abea000798cc-tmp\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.065130 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.065098 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bv4\" (UniqueName: \"kubernetes.io/projected/b8c32bd7-8e6a-401c-93fe-49b96703cd7a-kube-api-access-k5bv4\") pod \"multus-bgmzb\" (UID: \"b8c32bd7-8e6a-401c-93fe-49b96703cd7a\") " pod="openshift-multus/multus-bgmzb"
Apr 17 07:51:30.065849 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.065825 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2s9\" (UniqueName: \"kubernetes.io/projected/b2e0178e-22bf-4ec0-8752-a62c91d1d7a5-kube-api-access-4d2s9\") pod \"node-ca-mtwj6\" (UID: \"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5\") " pod="openshift-image-registry/node-ca-mtwj6"
Apr 17 07:51:30.068007 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.067983 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pt49\" (UniqueName: \"kubernetes.io/projected/753ccc51-1724-4403-97ba-abea000798cc-kube-api-access-9pt49\") pod \"tuned-rntgv\" (UID: \"753ccc51-1724-4403-97ba-abea000798cc\") " pod="openshift-cluster-node-tuning-operator/tuned-rntgv"
Apr 17 07:51:30.069383 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.069194 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbc24\" (UniqueName: \"kubernetes.io/projected/7c70fa88-6394-4033-b304-0ea283b0a7eb-kube-api-access-hbc24\") pod \"iptables-alerter-8vfz7\" (UID: \"7c70fa88-6394-4033-b304-0ea283b0a7eb\") " pod="openshift-network-operator/iptables-alerter-8vfz7"
Apr 17 07:51:30.069541 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.069525 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrcs\" (UniqueName: \"kubernetes.io/projected/63918c32-1f1d-43f2-9243-76c8cb35d556-kube-api-access-jsrcs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:30.070256 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.070237 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkpz\" (UniqueName: \"kubernetes.io/projected/f2e063f6-7071-4323-a26c-9d5f28ce786e-kube-api-access-gjkpz\") pod \"node-resolver-wcz5n\" (UID: \"f2e063f6-7071-4323-a26c-9d5f28ce786e\") " pod="openshift-dns/node-resolver-wcz5n"
Apr 17 07:51:30.070574 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.070551 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rx5k\" (UniqueName: \"kubernetes.io/projected/7e100a52-e772-4d62-a573-6f5b62a4671d-kube-api-access-7rx5k\") pod \"multus-additional-cni-plugins-9htcw\" (UID: \"7e100a52-e772-4d62-a573-6f5b62a4671d\") " pod="openshift-multus/multus-additional-cni-plugins-9htcw"
Apr 17 07:51:30.159747 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159709 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-log-socket\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159760 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x46l\" (UniqueName: \"kubernetes.io/projected/00007c9f-f097-4e98-b152-d5c5885a9d69-kube-api-access-5x46l\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-etc-selinux\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159811 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-kubelet\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159834 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-systemd-units\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159834 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-log-socket\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159856 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-socket-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-registration-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159903 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-systemd-units\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.159916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159904 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-node-log\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159934 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3-konnectivity-ca\") pod \"konnectivity-agent-p5wkz\" (UID: \"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3\") " pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159937 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-node-log\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159953 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-etc-selinux\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159993 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-registration-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.159996 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-kubelet\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160022 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-kubelet-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-slash\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160064 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-kubelet-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160099 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-systemd\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160123 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-slash\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-socket-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160127 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160158 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160171 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-systemd\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160177 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-ovn\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:30.160368 ip-10-0-133-228
kubenswrapper[2565]: I0417 07:51:30.160232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-run-ovn\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.160368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-sys-fs\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160266 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3-agent-certs\") pod \"konnectivity-agent-p5wkz\" (UID: \"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3\") " pod="kube-system/konnectivity-agent-p5wkz" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160296 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-var-lib-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160321 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-cni-netd\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 
ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160330 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-sys-fs\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160346 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkxd8\" (UniqueName: \"kubernetes.io/projected/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-kube-api-access-rkxd8\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-device-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160400 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-cni-netd\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160436 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-etc-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160402 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-etc-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160473 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovn-node-metrics-cert\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160488 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3-konnectivity-ca\") pod \"konnectivity-agent-p5wkz\" (UID: \"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3\") " pod="kube-system/konnectivity-agent-p5wkz" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160502 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160531 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-cni-bin\") pod 
\"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160545 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/00007c9f-f097-4e98-b152-d5c5885a9d69-device-dir\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160555 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160581 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-run-netns\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161148 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160585 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160606 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovnkube-script-lib\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160372 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-var-lib-openvswitch\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160668 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-run-netns\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-cni-bin\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160808 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-env-overrides\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160840 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.160864 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovnkube-config\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.161772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.161040 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-host-run-ovn-kubernetes\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.162299 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.162278 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-env-overrides\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.162537 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.162497 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovnkube-script-lib\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.162662 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.162649 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovnkube-config\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.162977 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.162958 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3-agent-certs\") pod \"konnectivity-agent-p5wkz\" (UID: \"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3\") " pod="kube-system/konnectivity-agent-p5wkz" Apr 17 07:51:30.163047 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.163017 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-ovn-node-metrics-cert\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.165934 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.165914 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:30.165934 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.165937 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:30.166060 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.165950 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.166060 
ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.166016 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.66599829 +0000 UTC m=+3.154879434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.167848 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.167798 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x46l\" (UniqueName: \"kubernetes.io/projected/00007c9f-f097-4e98-b152-d5c5885a9d69-kube-api-access-5x46l\") pod \"aws-ebs-csi-driver-node-249j9\" (UID: \"00007c9f-f097-4e98-b152-d5c5885a9d69\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.168043 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.168028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkxd8\" (UniqueName: \"kubernetes.io/projected/c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2-kube-api-access-rkxd8\") pod \"ovnkube-node-pqdwt\" (UID: \"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2\") " pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.243334 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.243304 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8vfz7" Apr 17 07:51:30.251936 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.251914 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wcz5n" Apr 17 07:51:30.263666 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.263640 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mtwj6" Apr 17 07:51:30.270273 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.270251 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p5wkz" Apr 17 07:51:30.276870 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.276849 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9htcw" Apr 17 07:51:30.283509 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.283485 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bgmzb" Apr 17 07:51:30.288154 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.288134 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rntgv" Apr 17 07:51:30.294763 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.294742 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" Apr 17 07:51:30.299367 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.299346 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" Apr 17 07:51:30.563440 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.563408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:30.563606 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.563539 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:30.563606 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.563597 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:31.563583066 +0000 UTC m=+4.052464196 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:30.765276 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.765245 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:30.765430 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.765385 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:30.765430 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.765399 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:30.765430 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.765410 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.765520 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:30.765460 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:31.765447813 +0000 UTC m=+4.254328942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:30.785492 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.785456 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00007c9f_f097_4e98_b152_d5c5885a9d69.slice/crio-62451e8c32a45ce74cb8fda5cfe8af631e037a5cb1d9a277979f82f440c3b599 WatchSource:0}: Error finding container 62451e8c32a45ce74cb8fda5cfe8af631e037a5cb1d9a277979f82f440c3b599: Status 404 returned error can't find the container with id 62451e8c32a45ce74cb8fda5cfe8af631e037a5cb1d9a277979f82f440c3b599 Apr 17 07:51:30.789127 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.789099 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c32bd7_8e6a_401c_93fe_49b96703cd7a.slice/crio-dd7b86d351e1c172d304946cc01dac07cbcd2a79f33a929a4932fc56092d0d22 WatchSource:0}: Error finding container dd7b86d351e1c172d304946cc01dac07cbcd2a79f33a929a4932fc56092d0d22: Status 404 returned error can't find the container with id dd7b86d351e1c172d304946cc01dac07cbcd2a79f33a929a4932fc56092d0d22 Apr 17 07:51:30.791238 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.791197 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c70fa88_6394_4033_b304_0ea283b0a7eb.slice/crio-f8e924162efc83ab8c1d3cd5b46066dca36b5792b00fb463d0c1e52c277174c2 WatchSource:0}: Error finding container 
f8e924162efc83ab8c1d3cd5b46066dca36b5792b00fb463d0c1e52c277174c2: Status 404 returned error can't find the container with id f8e924162efc83ab8c1d3cd5b46066dca36b5792b00fb463d0c1e52c277174c2 Apr 17 07:51:30.792111 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.791814 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec96c0de_0a97_4ee9_91ef_9bcb0e0c73d3.slice/crio-511a1df9f178c992551245b2aaa1019082e27ceadd148d673023793833e8274d WatchSource:0}: Error finding container 511a1df9f178c992551245b2aaa1019082e27ceadd148d673023793833e8274d: Status 404 returned error can't find the container with id 511a1df9f178c992551245b2aaa1019082e27ceadd148d673023793833e8274d Apr 17 07:51:30.794879 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.794608 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e0178e_22bf_4ec0_8752_a62c91d1d7a5.slice/crio-ff19ae1b039ad4999c87cbdbbe11bef225a8bedded78fb8e082d013588454502 WatchSource:0}: Error finding container ff19ae1b039ad4999c87cbdbbe11bef225a8bedded78fb8e082d013588454502: Status 404 returned error can't find the container with id ff19ae1b039ad4999c87cbdbbe11bef225a8bedded78fb8e082d013588454502 Apr 17 07:51:30.795709 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.795682 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3bd4b80_ecb6_4dd0_a2e6_88f1d0f483d2.slice/crio-5bec5fce69bc6294db4706b0072c263e288eebe3436ae8966d5e7d58c3585f39 WatchSource:0}: Error finding container 5bec5fce69bc6294db4706b0072c263e288eebe3436ae8966d5e7d58c3585f39: Status 404 returned error can't find the container with id 5bec5fce69bc6294db4706b0072c263e288eebe3436ae8966d5e7d58c3585f39 Apr 17 07:51:30.796564 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.796538 2565 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e100a52_e772_4d62_a573_6f5b62a4671d.slice/crio-26e7b97bc693071df7fcde768e8d81dbcfa22be09fa8557d1b63ec277396ab69 WatchSource:0}: Error finding container 26e7b97bc693071df7fcde768e8d81dbcfa22be09fa8557d1b63ec277396ab69: Status 404 returned error can't find the container with id 26e7b97bc693071df7fcde768e8d81dbcfa22be09fa8557d1b63ec277396ab69 Apr 17 07:51:30.797418 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.797351 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753ccc51_1724_4403_97ba_abea000798cc.slice/crio-7db305a8139ad32e682cea529693e3edb31ea0a445bb5bde2dee953b647e5253 WatchSource:0}: Error finding container 7db305a8139ad32e682cea529693e3edb31ea0a445bb5bde2dee953b647e5253: Status 404 returned error can't find the container with id 7db305a8139ad32e682cea529693e3edb31ea0a445bb5bde2dee953b647e5253 Apr 17 07:51:30.798553 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:51:30.798270 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e063f6_7071_4323_a26c_9d5f28ce786e.slice/crio-f3317f3a06f1e331750fd9937fc521d46af0062c914fed232a9bd79140393b24 WatchSource:0}: Error finding container f3317f3a06f1e331750fd9937fc521d46af0062c914fed232a9bd79140393b24: Status 404 returned error can't find the container with id f3317f3a06f1e331750fd9937fc521d46af0062c914fed232a9bd79140393b24 Apr 17 07:51:30.983492 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.983286 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:28 +0000 UTC" deadline="2027-12-08 02:45:12.961209887 +0000 UTC" Apr 17 07:51:30.983492 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:30.983488 2565 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14394h53m41.97772625s" Apr 17 07:51:31.048132 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.048097 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal" event={"ID":"6fa2d3d4e5da99e17b218f9fc59a91d2","Type":"ContainerStarted","Data":"b3f543c1f3ed869feb6e677645dd7e43b7c544868d94558c8ea60982c72b6585"} Apr 17 07:51:31.049184 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.049158 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerStarted","Data":"26e7b97bc693071df7fcde768e8d81dbcfa22be09fa8557d1b63ec277396ab69"} Apr 17 07:51:31.050101 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.050078 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"5bec5fce69bc6294db4706b0072c263e288eebe3436ae8966d5e7d58c3585f39"} Apr 17 07:51:31.051009 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.050986 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mtwj6" event={"ID":"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5","Type":"ContainerStarted","Data":"ff19ae1b039ad4999c87cbdbbe11bef225a8bedded78fb8e082d013588454502"} Apr 17 07:51:31.052065 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.052041 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" event={"ID":"00007c9f-f097-4e98-b152-d5c5885a9d69","Type":"ContainerStarted","Data":"62451e8c32a45ce74cb8fda5cfe8af631e037a5cb1d9a277979f82f440c3b599"} Apr 17 07:51:31.053191 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.053162 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wcz5n" 
event={"ID":"f2e063f6-7071-4323-a26c-9d5f28ce786e","Type":"ContainerStarted","Data":"f3317f3a06f1e331750fd9937fc521d46af0062c914fed232a9bd79140393b24"} Apr 17 07:51:31.054452 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.054427 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rntgv" event={"ID":"753ccc51-1724-4403-97ba-abea000798cc","Type":"ContainerStarted","Data":"7db305a8139ad32e682cea529693e3edb31ea0a445bb5bde2dee953b647e5253"} Apr 17 07:51:31.055988 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.055964 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p5wkz" event={"ID":"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3","Type":"ContainerStarted","Data":"511a1df9f178c992551245b2aaa1019082e27ceadd148d673023793833e8274d"} Apr 17 07:51:31.057678 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.057644 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8vfz7" event={"ID":"7c70fa88-6394-4033-b304-0ea283b0a7eb","Type":"ContainerStarted","Data":"f8e924162efc83ab8c1d3cd5b46066dca36b5792b00fb463d0c1e52c277174c2"} Apr 17 07:51:31.058525 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.058497 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bgmzb" event={"ID":"b8c32bd7-8e6a-401c-93fe-49b96703cd7a","Type":"ContainerStarted","Data":"dd7b86d351e1c172d304946cc01dac07cbcd2a79f33a929a4932fc56092d0d22"} Apr 17 07:51:31.059598 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.059553 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-228.ec2.internal" podStartSLOduration=2.059541207 podStartE2EDuration="2.059541207s" podCreationTimestamp="2026-04-17 07:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:31.059234374 
+0000 UTC m=+3.548115525" watchObservedRunningTime="2026-04-17 07:51:31.059541207 +0000 UTC m=+3.548422422" Apr 17 07:51:31.574871 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.574306 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:31.574871 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:31.574464 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:31.574871 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:31.574525 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:33.574507314 +0000 UTC m=+6.063388451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:31.776693 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:31.776640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:31.776886 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:31.776840 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:31.776954 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:31.776897 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:31.776954 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:31.776911 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:31.777052 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:31.776972 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:33.77695288 +0000 UTC m=+6.265834016 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:32.039578 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:32.039544 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:32.040175 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:32.039696 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:32.043430 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:32.040682 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:32.043430 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:32.040816 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:32.068110 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:32.068071 2565 generic.go:358] "Generic (PLEG): container finished" podID="1cba15509dfb2962e7fb3af060b0a393" containerID="3060b2a64ab2daa9fb289af1ce4db756c55c898bb7287c42c453a2a6e148f64a" exitCode=0 Apr 17 07:51:32.068317 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:32.068198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" event={"ID":"1cba15509dfb2962e7fb3af060b0a393","Type":"ContainerDied","Data":"3060b2a64ab2daa9fb289af1ce4db756c55c898bb7287c42c453a2a6e148f64a"} Apr 17 07:51:33.085190 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:33.085146 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" event={"ID":"1cba15509dfb2962e7fb3af060b0a393","Type":"ContainerStarted","Data":"1e7b55c07a3e292ce4a56ab55761695b6ce508e8b2e9d2526adf9920d64ec50b"} Apr 17 07:51:33.098417 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:33.098342 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-228.ec2.internal" podStartSLOduration=4.098322544 podStartE2EDuration="4.098322544s" podCreationTimestamp="2026-04-17 07:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:33.097967773 +0000 UTC m=+5.586848928" watchObservedRunningTime="2026-04-17 07:51:33.098322544 +0000 UTC m=+5.587203697" Apr 17 07:51:33.594865 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:33.594824 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:33.595050 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:33.594970 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:33.595050 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:33.595038 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:37.595017741 +0000 UTC m=+10.083898886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:33.796539 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:33.796498 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:33.796799 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:33.796710 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:33.796799 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:33.796735 2565 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:33.796799 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:33.796748 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:33.797005 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:33.796810 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. No retries permitted until 2026-04-17 07:51:37.796790575 +0000 UTC m=+10.285671704 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:34.038915 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:34.038882 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:34.039095 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:34.039014 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:34.039435 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:34.039412 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:34.039563 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:34.039538 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:36.039014 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:36.038978 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:36.039636 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:36.039027 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:36.039636 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:36.039122 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:36.039636 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:36.039290 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:37.629415 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:37.629372 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:37.629970 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:37.629510 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:37.629970 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:37.629579 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:45.62955863 +0000 UTC m=+18.118439779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:37.831162 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:37.831119 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:37.831364 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:37.831324 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:37.831364 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:37.831351 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:37.831364 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:37.831365 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:37.831520 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:37.831436 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. 
No retries permitted until 2026-04-17 07:51:45.831415174 +0000 UTC m=+18.320296317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:38.039686 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:38.039631 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:38.039878 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:38.039756 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:38.040136 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:38.040114 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:38.040246 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:38.040226 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:40.038867 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:40.038790 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:40.039461 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:40.038975 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:40.039461 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:40.039300 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:40.039461 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:40.039414 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:42.039396 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:42.039360 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:42.039396 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:42.039394 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:42.039867 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:42.039496 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:42.039867 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:42.039610 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:44.044174 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:44.044147 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:44.044174 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:44.044159 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:44.044658 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:44.044281 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:44.044658 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:44.044428 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:45.687921 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:45.687880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:45.688414 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:45.688043 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:45.688414 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:45.688118 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:01.688101065 +0000 UTC m=+34.176982195 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:45.889356 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:45.889313 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:45.889528 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:45.889497 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:45.889528 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:45.889520 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:45.889528 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:45.889530 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:45.889704 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:45.889592 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:01.889572569 +0000 UTC m=+34.378453714 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:46.038878 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:46.038847 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:46.039063 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:46.038847 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:46.039063 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:46.038942 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:46.039063 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:46.039034 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:47.115172 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.115136 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mzn5t"]
Apr 17 07:51:47.215342 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.215305 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.215519 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:47.215397 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd"
Apr 17 07:51:47.300512 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.300471 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a1d097b8-d651-4c5a-aee8-e970c942c7bd-dbus\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.300512 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.300519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.300738 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.300547 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a1d097b8-d651-4c5a-aee8-e970c942c7bd-kubelet-config\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.400944 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.400858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a1d097b8-d651-4c5a-aee8-e970c942c7bd-kubelet-config\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.401117 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.400968 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a1d097b8-d651-4c5a-aee8-e970c942c7bd-dbus\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.401117 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.400997 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.401117 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.401017 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a1d097b8-d651-4c5a-aee8-e970c942c7bd-kubelet-config\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.401267 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:47.401137 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:47.401267 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:47.401196 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret podName:a1d097b8-d651-4c5a-aee8-e970c942c7bd nodeName:}" failed. No retries permitted until 2026-04-17 07:51:47.901177937 +0000 UTC m=+20.390059074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret") pod "global-pull-secret-syncer-mzn5t" (UID: "a1d097b8-d651-4c5a-aee8-e970c942c7bd") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:47.401267 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.401136 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a1d097b8-d651-4c5a-aee8-e970c942c7bd-dbus\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.904517 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:47.904313 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:47.904630 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:47.904599 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:47.904677 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:47.904656 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret podName:a1d097b8-d651-4c5a-aee8-e970c942c7bd nodeName:}" failed. No retries permitted until 2026-04-17 07:51:48.90463348 +0000 UTC m=+21.393514609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret") pod "global-pull-secret-syncer-mzn5t" (UID: "a1d097b8-d651-4c5a-aee8-e970c942c7bd") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:48.039586 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.039553 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:48.039702 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:48.039681 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:48.039763 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.039727 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:48.039813 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:48.039785 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab"
Apr 17 07:51:48.111733 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.111699 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e100a52-e772-4d62-a573-6f5b62a4671d" containerID="25e63ac113e86e03a95f882dda1b430810f578757a9dff816869b94a07b7cea4" exitCode=0
Apr 17 07:51:48.111878 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.111789 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerDied","Data":"25e63ac113e86e03a95f882dda1b430810f578757a9dff816869b94a07b7cea4"}
Apr 17 07:51:48.113567 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.113406 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:51:48.113981 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.113955 2565 generic.go:358] "Generic (PLEG): container finished" podID="c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2" containerID="69e6f9758f988bc867c30892fc0dfd9e4dbf77b24187e0e1c70fef21343de5c3" exitCode=1
Apr 17 07:51:48.114110 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.114030 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerDied","Data":"69e6f9758f988bc867c30892fc0dfd9e4dbf77b24187e0e1c70fef21343de5c3"}
Apr 17 07:51:48.114110 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.114066 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"de7eb80a601e33b3bf7e306223cf918b997026f6e179fbc809d1c2ee4896d142"}
Apr 17 07:51:48.115630 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.115583 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mtwj6" event={"ID":"b2e0178e-22bf-4ec0-8752-a62c91d1d7a5","Type":"ContainerStarted","Data":"cc224eab1753c6564b21253faed4e6ab8737777952815e9065fcd1581a1763c7"}
Apr 17 07:51:48.117484 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.117435 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" event={"ID":"00007c9f-f097-4e98-b152-d5c5885a9d69","Type":"ContainerStarted","Data":"a160d857cb88fd3fcab9cc0086ee96480a49b522386c10fe32237f6908059472"}
Apr 17 07:51:48.118759 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.118739 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wcz5n" event={"ID":"f2e063f6-7071-4323-a26c-9d5f28ce786e","Type":"ContainerStarted","Data":"a31cb8a739e20581bcf7517582a6a2d9754ae3f010a47226147477513913b8a2"}
Apr 17 07:51:48.120194 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.120160 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rntgv" event={"ID":"753ccc51-1724-4403-97ba-abea000798cc","Type":"ContainerStarted","Data":"d3792936a0a170906589e9b200c64e11d6a7035dfa74bb91f60a62e14413eb03"}
Apr 17 07:51:48.121548 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.121534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p5wkz" event={"ID":"ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3","Type":"ContainerStarted","Data":"905e120990bd8fc1d44c5f01584796f5fcb41fdc9f7e868eba187f0a57a87ce3"}
Apr 17 07:51:48.122784 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.122763 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bgmzb" event={"ID":"b8c32bd7-8e6a-401c-93fe-49b96703cd7a","Type":"ContainerStarted","Data":"ba6530a14c302576e9785b651850da1df3cd38b890531086be9914a417122d80"}
Apr 17 07:51:48.154913 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.153379 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rntgv" podStartSLOduration=3.391748842 podStartE2EDuration="20.153358891s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.799676696 +0000 UTC m=+3.288557828" lastFinishedPulling="2026-04-17 07:51:47.561286734 +0000 UTC m=+20.050167877" observedRunningTime="2026-04-17 07:51:48.152363744 +0000 UTC m=+20.641244888" watchObservedRunningTime="2026-04-17 07:51:48.153358891 +0000 UTC m=+20.642240044"
Apr 17 07:51:48.184447 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.184403 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p5wkz" podStartSLOduration=3.41998324 podStartE2EDuration="20.184387103s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.795014224 +0000 UTC m=+3.283895354" lastFinishedPulling="2026-04-17 07:51:47.559418074 +0000 UTC m=+20.048299217" observedRunningTime="2026-04-17 07:51:48.167319133 +0000 UTC m=+20.656200284" watchObservedRunningTime="2026-04-17 07:51:48.184387103 +0000 UTC m=+20.673268252"
Apr 17 07:51:48.184716 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.184696 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bgmzb" podStartSLOduration=3.381311389 podStartE2EDuration="20.184691421s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.79087605 +0000 UTC m=+3.279757180" lastFinishedPulling="2026-04-17 07:51:47.594256078 +0000 UTC m=+20.083137212" observedRunningTime="2026-04-17 07:51:48.184252462 +0000 UTC m=+20.673133611" watchObservedRunningTime="2026-04-17 07:51:48.184691421 +0000 UTC m=+20.673572573"
Apr 17 07:51:48.198933 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.198890 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wcz5n" podStartSLOduration=3.439903425 podStartE2EDuration="20.198875935s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.800659348 +0000 UTC m=+3.289540478" lastFinishedPulling="2026-04-17 07:51:47.559631853 +0000 UTC m=+20.048512988" observedRunningTime="2026-04-17 07:51:48.198661669 +0000 UTC m=+20.687542822" watchObservedRunningTime="2026-04-17 07:51:48.198875935 +0000 UTC m=+20.687757086"
Apr 17 07:51:48.215203 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.215159 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mtwj6" podStartSLOduration=3.452522907 podStartE2EDuration="20.215143598s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.796690714 +0000 UTC m=+3.285571843" lastFinishedPulling="2026-04-17 07:51:47.559311403 +0000 UTC m=+20.048192534" observedRunningTime="2026-04-17 07:51:48.214736939 +0000 UTC m=+20.703618088" watchObservedRunningTime="2026-04-17 07:51:48.215143598 +0000 UTC m=+20.704024750"
Apr 17 07:51:48.911376 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:48.911275 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:48.911536 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:48.911446 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:48.911536 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:48.911522 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret podName:a1d097b8-d651-4c5a-aee8-e970c942c7bd nodeName:}" failed. No retries permitted until 2026-04-17 07:51:50.911503827 +0000 UTC m=+23.400384969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret") pod "global-pull-secret-syncer-mzn5t" (UID: "a1d097b8-d651-4c5a-aee8-e970c942c7bd") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:49.038762 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.038726 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:49.038905 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:49.038855 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd"
Apr 17 07:51:49.126529 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.126490 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8vfz7" event={"ID":"7c70fa88-6394-4033-b304-0ea283b0a7eb","Type":"ContainerStarted","Data":"b39b6567b63d8a88b6f1082191bd0a00796981e2c9eeebb8774a5a0202ff23fd"}
Apr 17 07:51:49.129332 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.129303 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:51:49.129731 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.129699 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"93d093c1b541cfc8966f5a7fa9a8224c17b2c61ceba43b094d32c85c43389b12"}
Apr 17 07:51:49.129837 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.129739 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"6b2ecbbf7f41c83252ae91d9e6db9b7ba4d654bd5b337609f8bc27db214e0ad1"}
Apr 17 07:51:49.129837 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.129755 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"7051b6fbed935ecdfbab0efc50f49f69626d0c514bfbe77a954fe6424c7014e4"}
Apr 17 07:51:49.129837 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.129767 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"e7e6b802b80811c5db72290d2dcc9b8895d41d8d1782a46bd5626d2e53bfda78"}
Apr 17 07:51:49.142283 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.142235 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8vfz7" podStartSLOduration=4.376922814 podStartE2EDuration="21.142203781s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.794138167 +0000 UTC m=+3.283019297" lastFinishedPulling="2026-04-17 07:51:47.559419114 +0000 UTC m=+20.048300264" observedRunningTime="2026-04-17 07:51:49.142149465 +0000 UTC m=+21.631030630" watchObservedRunningTime="2026-04-17 07:51:49.142203781 +0000 UTC m=+21.631084935"
Apr 17 07:51:49.308817 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:49.308793 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 07:51:50.012739 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.012617 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:51:49.308813807Z","UUID":"fbfe0d6d-2854-461f-9c0e-c799534fade4","Handler":null,"Name":"","Endpoint":""}
Apr 17 07:51:50.015515 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.015492 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 07:51:50.015515 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.015523 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 07:51:50.039001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.038967 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:50.039148 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:50.039088 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab"
Apr 17 07:51:50.039328 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.039306 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:50.039449 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:50.039426 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:50.133265 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.133228 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" event={"ID":"00007c9f-f097-4e98-b152-d5c5885a9d69","Type":"ContainerStarted","Data":"89d76b3e962d796ba4dca152751cd1759769e9cd11e98bdd03681bde697fbcb1"}
Apr 17 07:51:50.926385 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:50.926338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:50.926555 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:50.926503 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:50.926611 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:50.926582 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret podName:a1d097b8-d651-4c5a-aee8-e970c942c7bd nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.926560766 +0000 UTC m=+27.415441938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret") pod "global-pull-secret-syncer-mzn5t" (UID: "a1d097b8-d651-4c5a-aee8-e970c942c7bd") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:51.039587 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:51.039401 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:51.039749 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:51.039672 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd"
Apr 17 07:51:51.137784 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:51.137758 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:51:51.138171 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:51.138135 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"1f5a1a16fe0ace59e4e2e4a9643de2a00cc74831c22a4e9767a478ec97f7773a"}
Apr 17 07:51:51.140017 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:51.139992 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" event={"ID":"00007c9f-f097-4e98-b152-d5c5885a9d69","Type":"ContainerStarted","Data":"247d55c6b0f2b999254659aac86f576e60a4e03742e687e4b7e2891fa9902954"}
Apr 17 07:51:51.156583 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:51.156525 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-249j9" podStartSLOduration=3.615132829 podStartE2EDuration="23.156505448s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.788060841 +0000 UTC m=+3.276941989" lastFinishedPulling="2026-04-17 07:51:50.329433462 +0000 UTC m=+22.818314608" observedRunningTime="2026-04-17 07:51:51.155992296 +0000 UTC m=+23.644873485" watchObservedRunningTime="2026-04-17 07:51:51.156505448 +0000 UTC m=+23.645386601"
Apr 17 07:51:52.039092 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:52.039061 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:52.039316 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:52.039060 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:52.039316 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:52.039172 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab"
Apr 17 07:51:52.039316 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:52.039304 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:52.417824 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:52.417776 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:52.418368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:52.418333 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:53.039480 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.039447 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:53.039625 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:53.039554 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd"
Apr 17 07:51:53.147801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.147765 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e100a52-e772-4d62-a573-6f5b62a4671d" containerID="3ed0867c12f452694156e11cc853b25167b6896cef53e15593f3b5d446e24b11" exitCode=0
Apr 17 07:51:53.147957 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.147837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerDied","Data":"3ed0867c12f452694156e11cc853b25167b6896cef53e15593f3b5d446e24b11"}
Apr 17 07:51:53.151097 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.151077 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:51:53.151492 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.151469 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"2764887025f9c384c0ea5fa03543397595b9b6c6dd9778d2dbbb6cf7ca0e994b"}
Apr 17 07:51:53.151721 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.151699 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:53.151953 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.151926 2565 scope.go:117] "RemoveContainer" containerID="69e6f9758f988bc867c30892fc0dfd9e4dbf77b24187e0e1c70fef21343de5c3"
Apr 17 07:51:53.152329 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:53.152307 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p5wkz"
Apr 17 07:51:54.039445 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.039160 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:54.039445 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.039448 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:54.039969 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.039709 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab"
Apr 17 07:51:54.039969 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.039606 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:54.155617 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.155588 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e100a52-e772-4d62-a573-6f5b62a4671d" containerID="048a29435a6a75e01411d1bf8a2ba9c7519727c5b2e9cc6f9c3417a97cb04bb7" exitCode=0
Apr 17 07:51:54.155779 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.155665 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerDied","Data":"048a29435a6a75e01411d1bf8a2ba9c7519727c5b2e9cc6f9c3417a97cb04bb7"}
Apr 17 07:51:54.159129 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.159113 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:51:54.159476 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.159448 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" event={"ID":"c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2","Type":"ContainerStarted","Data":"05d36075b13f8e4fb12ce56c2d1d5d23d5a9ab2d58c747241f1dda4c9736df27"}
Apr 17 07:51:54.159835 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.159809 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:54.159933 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.159849 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:54.159933 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.159863 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:54.174263 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.174238 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:54.179101 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.179080 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:51:54.199868 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.199822 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt" podStartSLOduration=9.178744062 podStartE2EDuration="26.199807328s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.797358382 +0000 UTC m=+3.286239518" lastFinishedPulling="2026-04-17 07:51:47.81842164 +0000 UTC m=+20.307302784" observedRunningTime="2026-04-17 07:51:54.19949907 +0000 UTC m=+26.688380217" watchObservedRunningTime="2026-04-17 07:51:54.199807328 +0000 UTC m=+26.688688504"
Apr 17 07:51:54.604280 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.604238 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k6mnq"]
Apr 17 07:51:54.604463 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.604441 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:54.604590 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.604563 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556"
Apr 17 07:51:54.604773 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.604750 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mzn5t"]
Apr 17 07:51:54.604885 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.604846 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:54.604999 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.604939 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd"
Apr 17 07:51:54.607294 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.607273 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z7tsc"]
Apr 17 07:51:54.607402 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.607349 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:54.607458 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.607417 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab"
Apr 17 07:51:54.961047 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:54.960986 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:51:54.961251 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.961156 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:54.961251 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:54.961244 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret podName:a1d097b8-d651-4c5a-aee8-e970c942c7bd nodeName:}" failed. No retries permitted until 2026-04-17 07:52:02.961225742 +0000 UTC m=+35.450106888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret") pod "global-pull-secret-syncer-mzn5t" (UID: "a1d097b8-d651-4c5a-aee8-e970c942c7bd") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:55.163076 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:55.162991 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e100a52-e772-4d62-a573-6f5b62a4671d" containerID="f52400e3fdddeb87fcc357c5767d358e30996ec7a48ae6f63c39483592139dfc" exitCode=0
Apr 17 07:51:55.163924 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:55.163076 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerDied","Data":"f52400e3fdddeb87fcc357c5767d358e30996ec7a48ae6f63c39483592139dfc"}
Apr 17 07:51:56.039725 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:56.039687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:51:56.039899 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:56.039687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:51:56.039899 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:56.039810 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:51:56.039979 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:56.039904 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:56.039979 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:56.039697 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t" Apr 17 07:51:56.040056 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:56.040008 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd" Apr 17 07:51:58.039940 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:58.039726 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t" Apr 17 07:51:58.039940 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:58.039918 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:51:58.040477 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:51:58.039874 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:51:58.040477 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:58.040002 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd" Apr 17 07:51:58.040477 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:58.040102 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:51:58.040477 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:51:58.040197 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:52:00.038752 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:00.038719 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:52:00.039244 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:00.038835 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t" Apr 17 07:52:00.039244 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:00.038868 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:52:00.039244 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:00.038830 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z7tsc" podUID="919fa45a-692a-4f75-a7ff-12f0085459ab" Apr 17 07:52:00.039244 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:00.038911 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mzn5t" podUID="a1d097b8-d651-4c5a-aee8-e970c942c7bd" Apr 17 07:52:00.039244 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:00.038989 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:52:01.378060 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.378035 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-228.ec2.internal" event="NodeReady" Apr 17 07:52:01.378528 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.378163 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 07:52:01.423382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.423347 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z5xmb"] Apr 17 07:52:01.438600 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.438571 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-62zbn"] Apr 17 07:52:01.438755 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.438737 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.440829 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.440807 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 07:52:01.440949 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.440810 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l9k4x\"" Apr 17 07:52:01.441261 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.441245 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 07:52:01.453616 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.453595 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-62zbn"] Apr 17 07:52:01.453616 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.453619 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z5xmb"] 
Apr 17 07:52:01.453738 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.453700 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:52:01.455790 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.455771 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 07:52:01.455790 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.455785 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 07:52:01.455956 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.455773 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 07:52:01.455956 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.455932 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tgj8b\"" Apr 17 07:52:01.510712 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.510673 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b550782-b0e2-4efb-9013-806a1ec8d616-tmp-dir\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.510884 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.510779 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b550782-b0e2-4efb-9013-806a1ec8d616-config-volume\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.510884 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.510832 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w2n9\" (UniqueName: \"kubernetes.io/projected/7b550782-b0e2-4efb-9013-806a1ec8d616-kube-api-access-8w2n9\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.510884 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.510855 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.611551 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.611551 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611520 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b550782-b0e2-4efb-9013-806a1ec8d616-tmp-dir\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.611706 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.611617 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:01.611706 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.611672 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. 
No retries permitted until 2026-04-17 07:52:02.111656627 +0000 UTC m=+34.600537757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found Apr 17 07:52:01.611706 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611697 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:52:01.611818 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611752 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zf7v\" (UniqueName: \"kubernetes.io/projected/66caf165-b357-465a-87dc-24e5229f236e-kube-api-access-6zf7v\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:52:01.611818 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b550782-b0e2-4efb-9013-806a1ec8d616-config-volume\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.611903 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w2n9\" (UniqueName: \"kubernetes.io/projected/7b550782-b0e2-4efb-9013-806a1ec8d616-kube-api-access-8w2n9\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " 
pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.611903 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.611886 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b550782-b0e2-4efb-9013-806a1ec8d616-tmp-dir\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.612274 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.612256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b550782-b0e2-4efb-9013-806a1ec8d616-config-volume\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.634672 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.634642 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w2n9\" (UniqueName: \"kubernetes.io/projected/7b550782-b0e2-4efb-9013-806a1ec8d616-kube-api-access-8w2n9\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:01.712721 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.712686 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:52:01.712885 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.712744 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zf7v\" (UniqueName: \"kubernetes.io/projected/66caf165-b357-465a-87dc-24e5229f236e-kube-api-access-6zf7v\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 
07:52:01.712885 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.712773 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:52:01.712885 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.712843 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:01.712885 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.712880 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:01.713011 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.712908 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:52:02.212891325 +0000 UTC m=+34.701772469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found Apr 17 07:52:01.713011 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.712925 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:33.712916447 +0000 UTC m=+66.201797578 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:01.721607 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.721578 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zf7v\" (UniqueName: \"kubernetes.io/projected/66caf165-b357-465a-87dc-24e5229f236e-kube-api-access-6zf7v\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:52:01.914385 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:01.914298 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:52:01.914554 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.914429 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:01.914554 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.914447 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:01.914554 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.914461 2565 projected.go:194] Error preparing data for projected volume kube-api-access-6l74r for pod openshift-network-diagnostics/network-check-target-z7tsc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:01.914554 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:01.914521 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r podName:919fa45a-692a-4f75-a7ff-12f0085459ab nodeName:}" failed. No retries permitted until 2026-04-17 07:52:33.914503733 +0000 UTC m=+66.403384866 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6l74r" (UniqueName: "kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r") pod "network-check-target-z7tsc" (UID: "919fa45a-692a-4f75-a7ff-12f0085459ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:02.039318 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.039276 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t" Apr 17 07:52:02.039513 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.039276 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc" Apr 17 07:52:02.039513 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.039276 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:52:02.042437 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.042413 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 07:52:02.042561 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.042420 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 07:52:02.042561 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.042505 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqz2q\"" Apr 17 07:52:02.042674 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.042559 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 07:52:02.042987 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.042968 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 07:52:02.043149 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.043131 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4xllf\"" Apr 17 07:52:02.116061 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.116027 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:02.116206 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:02.116174 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:02.116273 ip-10-0-133-228 
kubenswrapper[2565]: E0417 07:52:02.116260 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:03.116244564 +0000 UTC m=+35.605125694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found Apr 17 07:52:02.182547 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.182467 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e100a52-e772-4d62-a573-6f5b62a4671d" containerID="6f47f404476e55772680549d15fc12dd4e8277e41984e0f6c3a9e09fd0a9afa6" exitCode=0 Apr 17 07:52:02.182547 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.182530 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerDied","Data":"6f47f404476e55772680549d15fc12dd4e8277e41984e0f6c3a9e09fd0a9afa6"} Apr 17 07:52:02.217095 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:02.217056 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:52:02.217297 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:02.217200 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:02.217363 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:02.217321 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:52:03.217299687 +0000 UTC m=+35.706180817 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found Apr 17 07:52:03.023390 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.023355 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t" Apr 17 07:52:03.026272 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.026252 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a1d097b8-d651-4c5a-aee8-e970c942c7bd-original-pull-secret\") pod \"global-pull-secret-syncer-mzn5t\" (UID: \"a1d097b8-d651-4c5a-aee8-e970c942c7bd\") " pod="kube-system/global-pull-secret-syncer-mzn5t" Apr 17 07:52:03.123924 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.123883 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:52:03.124098 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:03.124038 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:03.124145 ip-10-0-133-228 kubenswrapper[2565]: E0417 
07:52:03.124104 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:05.124089037 +0000 UTC m=+37.612970171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:03.186631 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.186596 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e100a52-e772-4d62-a573-6f5b62a4671d" containerID="8d06b08c933ac78101929b9cd5dca9edc88c5bb8431709d3f821c2712d1fc499" exitCode=0
Apr 17 07:52:03.186779 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.186655 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerDied","Data":"8d06b08c933ac78101929b9cd5dca9edc88c5bb8431709d3f821c2712d1fc499"}
Apr 17 07:52:03.224634 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.224447 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:52:03.224762 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:03.224594 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:03.224762 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:03.224759 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:52:05.224742319 +0000 UTC m=+37.713623452 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found
Apr 17 07:52:03.250937 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.250905 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mzn5t"
Apr 17 07:52:03.432778 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:03.432749 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mzn5t"]
Apr 17 07:52:03.436135 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:52:03.436102 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d097b8_d651_4c5a_aee8_e970c942c7bd.slice/crio-9caef1f288244b2be053bbf9d0b69d9b0da841177a09b50e77ab539f5d3ae718 WatchSource:0}: Error finding container 9caef1f288244b2be053bbf9d0b69d9b0da841177a09b50e77ab539f5d3ae718: Status 404 returned error can't find the container with id 9caef1f288244b2be053bbf9d0b69d9b0da841177a09b50e77ab539f5d3ae718
Apr 17 07:52:04.190033 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:04.189995 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mzn5t" event={"ID":"a1d097b8-d651-4c5a-aee8-e970c942c7bd","Type":"ContainerStarted","Data":"9caef1f288244b2be053bbf9d0b69d9b0da841177a09b50e77ab539f5d3ae718"}
Apr 17 07:52:04.193544 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:04.193509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9htcw" event={"ID":"7e100a52-e772-4d62-a573-6f5b62a4671d","Type":"ContainerStarted","Data":"e9c4e6cab94d5213dcb0c522d35525f0821560f1cffbb7f9558d6b744d5ada7f"}
Apr 17 07:52:04.214581 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:04.214523 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9htcw" podStartSLOduration=6.002131323 podStartE2EDuration="36.21450728s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:51:30.798541755 +0000 UTC m=+3.287422897" lastFinishedPulling="2026-04-17 07:52:01.01091771 +0000 UTC m=+33.499798854" observedRunningTime="2026-04-17 07:52:04.213090372 +0000 UTC m=+36.701971537" watchObservedRunningTime="2026-04-17 07:52:04.21450728 +0000 UTC m=+36.703388431"
Apr 17 07:52:05.140609 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:05.140575 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:52:05.140794 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:05.140752 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:05.140862 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:05.140829 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:09.1408057 +0000 UTC m=+41.629686830 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:05.240897 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:05.240861 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:52:05.241358 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:05.241334 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:05.241434 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:05.241427 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:52:09.241404834 +0000 UTC m=+41.730285967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found
Apr 17 07:52:08.203351 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:08.203140 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mzn5t" event={"ID":"a1d097b8-d651-4c5a-aee8-e970c942c7bd","Type":"ContainerStarted","Data":"e4529333989970330e33f8a23c8efc56129860120323b6fdcd118361145dca83"}
Apr 17 07:52:08.217170 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:08.217122 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mzn5t" podStartSLOduration=17.364703875 podStartE2EDuration="21.217106239s" podCreationTimestamp="2026-04-17 07:51:47 +0000 UTC" firstStartedPulling="2026-04-17 07:52:03.438001332 +0000 UTC m=+35.926882462" lastFinishedPulling="2026-04-17 07:52:07.290403682 +0000 UTC m=+39.779284826" observedRunningTime="2026-04-17 07:52:08.216781538 +0000 UTC m=+40.705662701" watchObservedRunningTime="2026-04-17 07:52:08.217106239 +0000 UTC m=+40.705987391"
Apr 17 07:52:09.168518 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:09.168475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:52:09.168687 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:09.168634 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:09.168726 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:09.168702 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:17.168686626 +0000 UTC m=+49.657567757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:09.269234 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:09.269171 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:52:09.269619 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:09.269329 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:09.269619 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:09.269394 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:52:17.269379089 +0000 UTC m=+49.758260218 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found
Apr 17 07:52:17.224185 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:17.224146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:52:17.224670 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:17.224271 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:17.224670 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:17.224321 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:33.224306624 +0000 UTC m=+65.713187754 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:17.324658 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:17.324625 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:52:17.324805 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:17.324784 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:17.324890 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:17.324879 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:52:33.324858016 +0000 UTC m=+65.813739160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found
Apr 17 07:52:26.175467 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:26.175439 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pqdwt"
Apr 17 07:52:33.236868 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.236810 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:52:33.237370 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:33.236979 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:33.237370 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:33.237062 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:05.237043389 +0000 UTC m=+97.725924519 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:33.337397 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.337341 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:52:33.337571 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:33.337517 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:33.337617 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:33.337598 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:53:05.337581391 +0000 UTC m=+97.826462520 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found
Apr 17 07:52:33.739294 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.739250 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:52:33.741952 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.741933 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:52:33.750161 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:33.750133 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:52:33.750268 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:52:33.750234 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:37.750193079 +0000 UTC m=+130.239074209 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : secret "metrics-daemon-secret" not found
Apr 17 07:52:33.940854 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.940819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:52:33.943459 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.943437 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:52:33.953088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.953067 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:52:33.965082 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:33.965054 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l74r\" (UniqueName: \"kubernetes.io/projected/919fa45a-692a-4f75-a7ff-12f0085459ab-kube-api-access-6l74r\") pod \"network-check-target-z7tsc\" (UID: \"919fa45a-692a-4f75-a7ff-12f0085459ab\") " pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:52:34.159330 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:34.159241 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4xllf\""
Apr 17 07:52:34.167237 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:34.167187 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:52:34.300402 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:34.300373 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z7tsc"]
Apr 17 07:52:34.303792 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:52:34.303756 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919fa45a_692a_4f75_a7ff_12f0085459ab.slice/crio-1c11d59a88ed505c61d6090bc318fc0eb68c9be094bc090cd8730467a0571cfb WatchSource:0}: Error finding container 1c11d59a88ed505c61d6090bc318fc0eb68c9be094bc090cd8730467a0571cfb: Status 404 returned error can't find the container with id 1c11d59a88ed505c61d6090bc318fc0eb68c9be094bc090cd8730467a0571cfb
Apr 17 07:52:35.256657 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:35.256591 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z7tsc" event={"ID":"919fa45a-692a-4f75-a7ff-12f0085459ab","Type":"ContainerStarted","Data":"1c11d59a88ed505c61d6090bc318fc0eb68c9be094bc090cd8730467a0571cfb"}
Apr 17 07:52:37.261362 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:37.261324 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z7tsc" event={"ID":"919fa45a-692a-4f75-a7ff-12f0085459ab","Type":"ContainerStarted","Data":"59e1b9f33707619a95ee7299ddcb5d7766938f98deb356bc146a9b509a807302"}
Apr 17 07:52:37.261741 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:37.261532 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:52:37.276580 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:52:37.276532 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z7tsc" podStartSLOduration=66.630762601 podStartE2EDuration="1m9.276517236s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:52:34.305616847 +0000 UTC m=+66.794497992" lastFinishedPulling="2026-04-17 07:52:36.951371481 +0000 UTC m=+69.440252627" observedRunningTime="2026-04-17 07:52:37.276252602 +0000 UTC m=+69.765133751" watchObservedRunningTime="2026-04-17 07:52:37.276517236 +0000 UTC m=+69.765398434"
Apr 17 07:53:05.243461 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:53:05.243415 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:53:05.243898 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:53:05.243557 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:05.243898 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:53:05.243633 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:09.243617502 +0000 UTC m=+161.732498632 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:05.343956 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:53:05.343920 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:53:05.344095 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:53:05.344038 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:05.344140 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:53:05.344110 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:54:09.344092052 +0000 UTC m=+161.832973186 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found
Apr 17 07:53:08.266394 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:53:08.266363 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z7tsc"
Apr 17 07:53:37.767397 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:53:37.767355 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:53:37.767886 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:53:37.767471 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:53:37.767886 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:53:37.767526 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs podName:63918c32-1f1d-43f2-9243-76c8cb35d556 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:39.767511931 +0000 UTC m=+252.256393061 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs") pod "network-metrics-daemon-k6mnq" (UID: "63918c32-1f1d-43f2-9243-76c8cb35d556") : secret "metrics-daemon-secret" not found
Apr 17 07:54:01.106900 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.106866 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"]
Apr 17 07:54:01.108831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.108809 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"
Apr 17 07:54:01.110965 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.110943 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.110965 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.110953 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.111144 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.111056 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 07:54:01.111797 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.111779 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gbk8m\""
Apr 17 07:54:01.119467 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.119445 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"]
Apr 17 07:54:01.208139 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.208105 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"]
Apr 17 07:54:01.209923 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.209907 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-779d7cd6d-kbl8s"]
Apr 17 07:54:01.210074 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.210057 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"
Apr 17 07:54:01.211554 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.211538 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:01.212178 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.212164 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z94wh\""
Apr 17 07:54:01.212178 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.212172 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 07:54:01.212366 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.212353 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 07:54:01.214064 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214047 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.214404 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214375 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 07:54:01.214491 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214439 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 07:54:01.214547 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214501 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.214547 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214505 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 07:54:01.214547 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214524 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 07:54:01.214795 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.214781 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-zqr6b\""
Apr 17 07:54:01.226001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.224202 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"]
Apr 17 07:54:01.229241 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.229200 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-779d7cd6d-kbl8s"]
Apr 17 07:54:01.232193 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.232165 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m66jp\" (UniqueName: \"kubernetes.io/projected/94325974-a593-47a2-94a2-4383b50e341c-kube-api-access-m66jp\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"
Apr 17 07:54:01.232314 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.232266 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"
Apr 17 07:54:01.307823 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.307788 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm"]
Apr 17 07:54:01.309616 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.309592 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm"
Apr 17 07:54:01.310816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.310766 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qlgcs"]
Apr 17 07:54:01.311830 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.311804 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 07:54:01.311830 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.311820 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 07:54:01.312003 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.311815 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.312135 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.312117 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.312203 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.312117 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-dq6hf\""
Apr 17 07:54:01.312648 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.312634 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qlgcs"
Apr 17 07:54:01.314706 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.314686 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6"]
Apr 17 07:54:01.316372 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.316355 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6"
Apr 17 07:54:01.318322 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318303 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hldj9\""
Apr 17 07:54:01.318425 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318304 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 07:54:01.318425 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318378 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.318425 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318395 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qgqnh\""
Apr 17 07:54:01.318689 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318667 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 07:54:01.318772 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318756 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.319001 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.318987 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:01.319391 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.319375 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:01.319440 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.319421 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 07:54:01.320048 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.320032 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 07:54:01.322651 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.322630 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm"]
Apr 17 07:54:01.325924 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.325904 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qlgcs"]
Apr 17 07:54:01.327165 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.327143 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 07:54:01.328180 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.328157 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6"]
Apr 17 07:54:01.332985 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.332949 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m66jp\" (UniqueName: \"kubernetes.io/projected/94325974-a593-47a2-94a2-4383b50e341c-kube-api-access-m66jp\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"
Apr 17 07:54:01.333110 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.332996 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-default-certificate\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:01.333110 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333021 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfnk8\" (UniqueName: \"kubernetes.io/projected/d999a14b-e053-4d0b-8b72-526fefe663ca-kube-api-access-kfnk8\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:01.333110 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333066 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:01.333285 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333162 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"
Apr 17 07:54:01.333285 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333204 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-stats-auth\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:01.333285 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.333256 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 07:54:01.333285 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333272 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:01.333473 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333311 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"
Apr 17 07:54:01.333473 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.333345 2565 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls podName:94325974-a593-47a2-94a2-4383b50e341c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:01.833311829 +0000 UTC m=+154.322192972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-skwnb" (UID: "94325974-a593-47a2-94a2-4383b50e341c") : secret "samples-operator-tls" not found Apr 17 07:54:01.333473 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.333389 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:01.345403 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.345373 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m66jp\" (UniqueName: \"kubernetes.io/projected/94325974-a593-47a2-94a2-4383b50e341c-kube-api-access-m66jp\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:01.434567 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434482 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/27196395-61d5-4866-b7d2-ebf227547861-snapshots\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.434567 ip-10-0-133-228 kubenswrapper[2565]: I0417 
07:54:01.434522 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.434567 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434543 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hklbl\" (UniqueName: \"kubernetes.io/projected/fd3831e0-1641-4a43-be1b-520e1a334313-kube-api-access-hklbl\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.434787 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434612 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:01.434787 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:01.434787 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434718 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.434787 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.434724 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:54:01.434787 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434742 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-default-certificate\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.434787 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434762 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mfk\" (UniqueName: \"kubernetes.io/projected/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-kube-api-access-j6mfk\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.435067 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.434795 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert podName:b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:01.934773199 +0000 UTC m=+154.423654332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfd6g" (UID: "b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3") : secret "networking-console-plugin-cert" not found Apr 17 07:54:01.435067 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434917 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbp6\" (UniqueName: \"kubernetes.io/projected/27196395-61d5-4866-b7d2-ebf227547861-kube-api-access-mxbp6\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.435067 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434940 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.435067 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.434973 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-stats-auth\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.435293 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.435092 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. 
No retries permitted until 2026-04-17 07:54:01.93507086 +0000 UTC m=+154.423951996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:01.435293 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.435293 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435174 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27196395-61d5-4866-b7d2-ebf227547861-service-ca-bundle\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.435293 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435231 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27196395-61d5-4866-b7d2-ebf227547861-serving-cert\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.435293 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.435236 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:01.435293 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:54:01.435281 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfnk8\" (UniqueName: \"kubernetes.io/projected/d999a14b-e053-4d0b-8b72-526fefe663ca-kube-api-access-kfnk8\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.435576 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.435311 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:01.935294193 +0000 UTC m=+154.424175337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : secret "router-metrics-certs-default" not found Apr 17 07:54:01.435576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435339 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27196395-61d5-4866-b7d2-ebf227547861-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.435576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435379 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3831e0-1641-4a43-be1b-520e1a334313-config\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.435576 
ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435416 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3831e0-1641-4a43-be1b-520e1a334313-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.435576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:01.435576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.435461 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27196395-61d5-4866-b7d2-ebf227547861-tmp\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.437251 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.437229 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-stats-auth\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.437345 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.437313 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-default-certificate\") pod 
\"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.443460 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.443439 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfnk8\" (UniqueName: \"kubernetes.io/projected/d999a14b-e053-4d0b-8b72-526fefe663ca-kube-api-access-kfnk8\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.536468 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536429 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mfk\" (UniqueName: \"kubernetes.io/projected/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-kube-api-access-j6mfk\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.536468 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536466 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbp6\" (UniqueName: \"kubernetes.io/projected/27196395-61d5-4866-b7d2-ebf227547861-kube-api-access-mxbp6\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.536683 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536633 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27196395-61d5-4866-b7d2-ebf227547861-service-ca-bundle\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.536683 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:54:01.536662 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27196395-61d5-4866-b7d2-ebf227547861-serving-cert\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.536756 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27196395-61d5-4866-b7d2-ebf227547861-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.536756 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3831e0-1641-4a43-be1b-520e1a334313-config\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.536756 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536750 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3831e0-1641-4a43-be1b-520e1a334313-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.536872 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536804 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27196395-61d5-4866-b7d2-ebf227547861-tmp\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: 
\"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.536872 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536840 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/27196395-61d5-4866-b7d2-ebf227547861-snapshots\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.536872 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536865 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.537020 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536888 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hklbl\" (UniqueName: \"kubernetes.io/projected/fd3831e0-1641-4a43-be1b-520e1a334313-kube-api-access-hklbl\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.537020 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.536938 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.537348 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:54:01.537310 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/27196395-61d5-4866-b7d2-ebf227547861-tmp\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.537458 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.537419 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd3831e0-1641-4a43-be1b-520e1a334313-config\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.537518 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.537495 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27196395-61d5-4866-b7d2-ebf227547861-service-ca-bundle\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.537733 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.537704 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27196395-61d5-4866-b7d2-ebf227547861-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.537859 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.537838 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.537952 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.537934 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/27196395-61d5-4866-b7d2-ebf227547861-snapshots\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.539463 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.539443 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27196395-61d5-4866-b7d2-ebf227547861-serving-cert\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.539566 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.539523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd3831e0-1641-4a43-be1b-520e1a334313-serving-cert\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.539566 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.539541 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.545118 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.545098 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-j6mfk\" (UniqueName: \"kubernetes.io/projected/cdcde5b0-5ef1-4fd5-b0b7-de55988110a6-kube-api-access-j6mfk\") pod \"kube-storage-version-migrator-operator-6769c5d45-qfdj6\" (UID: \"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.545234 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.545165 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbp6\" (UniqueName: \"kubernetes.io/projected/27196395-61d5-4866-b7d2-ebf227547861-kube-api-access-mxbp6\") pod \"insights-operator-585dfdc468-qlgcs\" (UID: \"27196395-61d5-4866-b7d2-ebf227547861\") " pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.545234 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.545202 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hklbl\" (UniqueName: \"kubernetes.io/projected/fd3831e0-1641-4a43-be1b-520e1a334313-kube-api-access-hklbl\") pod \"service-ca-operator-d6fc45fc5-hsdfm\" (UID: \"fd3831e0-1641-4a43-be1b-520e1a334313\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.620249 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.620195 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" Apr 17 07:54:01.628070 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.628033 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qlgcs" Apr 17 07:54:01.633897 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.633865 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" Apr 17 07:54:01.759464 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.759417 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm"] Apr 17 07:54:01.763313 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:01.763274 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3831e0_1641_4a43_be1b_520e1a334313.slice/crio-35f336e5f642ef1d59a7858e241908f269d609880efc170fbac61b95bdddf791 WatchSource:0}: Error finding container 35f336e5f642ef1d59a7858e241908f269d609880efc170fbac61b95bdddf791: Status 404 returned error can't find the container with id 35f336e5f642ef1d59a7858e241908f269d609880efc170fbac61b95bdddf791 Apr 17 07:54:01.771010 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.770984 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qlgcs"] Apr 17 07:54:01.774159 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:01.774131 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27196395_61d5_4866_b7d2_ebf227547861.slice/crio-af0163a69f200ea732a11bf2ff31b4ab7d044d3feaf4145cb768f28d0d534ce1 WatchSource:0}: Error finding container af0163a69f200ea732a11bf2ff31b4ab7d044d3feaf4145cb768f28d0d534ce1: Status 404 returned error can't find the container with id af0163a69f200ea732a11bf2ff31b4ab7d044d3feaf4145cb768f28d0d534ce1 Apr 17 07:54:01.786305 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.786276 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6"] Apr 17 07:54:01.790441 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:01.790415 2565 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcde5b0_5ef1_4fd5_b0b7_de55988110a6.slice/crio-72c155696e7b533e65228c5aee21ca71548734260a2bf2d1d4f5da63599dc241 WatchSource:0}: Error finding container 72c155696e7b533e65228c5aee21ca71548734260a2bf2d1d4f5da63599dc241: Status 404 returned error can't find the container with id 72c155696e7b533e65228c5aee21ca71548734260a2bf2d1d4f5da63599dc241 Apr 17 07:54:01.839768 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.839730 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:01.839949 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.839889 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:01.839991 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.839960 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls podName:94325974-a593-47a2-94a2-4383b50e341c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.839941902 +0000 UTC m=+155.328823033 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-skwnb" (UID: "94325974-a593-47a2-94a2-4383b50e341c") : secret "samples-operator-tls" not found Apr 17 07:54:01.941034 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.940981 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:01.941284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.941056 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.941284 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.941134 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:54:01.941284 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.941204 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert podName:b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.941186951 +0000 UTC m=+155.430068088 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfd6g" (UID: "b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3") : secret "networking-console-plugin-cert" not found Apr 17 07:54:01.941284 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:01.941246 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:01.941284 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.941264 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.941251211 +0000 UTC m=+155.430132346 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:01.941486 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.941310 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:01.941486 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:01.941341 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. 
No retries permitted until 2026-04-17 07:54:02.941331486 +0000 UTC m=+155.430212624 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : secret "router-metrics-certs-default" not found Apr 17 07:54:02.433156 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.433082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" event={"ID":"fd3831e0-1641-4a43-be1b-520e1a334313","Type":"ContainerStarted","Data":"35f336e5f642ef1d59a7858e241908f269d609880efc170fbac61b95bdddf791"} Apr 17 07:54:02.434931 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.434893 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" event={"ID":"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6","Type":"ContainerStarted","Data":"72c155696e7b533e65228c5aee21ca71548734260a2bf2d1d4f5da63599dc241"} Apr 17 07:54:02.436591 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.436561 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlgcs" event={"ID":"27196395-61d5-4866-b7d2-ebf227547861","Type":"ContainerStarted","Data":"af0163a69f200ea732a11bf2ff31b4ab7d044d3feaf4145cb768f28d0d534ce1"} Apr 17 07:54:02.849418 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.849377 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:02.849640 
ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.849612 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:02.849821 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.849708 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls podName:94325974-a593-47a2-94a2-4383b50e341c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:04.849676257 +0000 UTC m=+157.338557391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-skwnb" (UID: "94325974-a593-47a2-94a2-4383b50e341c") : secret "samples-operator-tls" not found Apr 17 07:54:02.950315 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.950279 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:02.950516 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.950386 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:02.950516 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:02.950440 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:02.950638 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.950594 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:04.950574743 +0000 UTC m=+157.439455886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:02.950962 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.950840 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:54:02.950962 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.950878 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:02.950962 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.950917 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert podName:b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:04.950897173 +0000 UTC m=+157.439778311 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfd6g" (UID: "b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3") : secret "networking-console-plugin-cert" not found Apr 17 07:54:02.950962 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:02.950963 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:04.950943119 +0000 UTC m=+157.439824254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : secret "router-metrics-certs-default" not found Apr 17 07:54:04.449694 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.449658 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-z5xmb" podUID="7b550782-b0e2-4efb-9013-806a1ec8d616" Apr 17 07:54:04.462198 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.462170 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-62zbn" podUID="66caf165-b357-465a-87dc-24e5229f236e" Apr 17 07:54:04.868817 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:04.868771 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:04.868992 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.868967 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:04.869066 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.869056 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls podName:94325974-a593-47a2-94a2-4383b50e341c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:08.869032357 +0000 UTC m=+161.357913489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-skwnb" (UID: "94325974-a593-47a2-94a2-4383b50e341c") : secret "samples-operator-tls" not found Apr 17 07:54:04.969411 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:04.969365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:04.969581 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:04.969464 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:04.969581 ip-10-0-133-228 kubenswrapper[2565]: 
E0417 07:54:04.969566 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:08.96954269 +0000 UTC m=+161.458423826 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:04.969687 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:04.969609 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:04.969687 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.969649 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:04.969753 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.969711 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:08.969693186 +0000 UTC m=+161.458574324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : secret "router-metrics-certs-default" not found Apr 17 07:54:04.969799 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.969752 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:54:04.969799 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:04.969788 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert podName:b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:08.969776244 +0000 UTC m=+161.458657381 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfd6g" (UID: "b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3") : secret "networking-console-plugin-cert" not found Apr 17 07:54:05.061954 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:05.061905 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-k6mnq" podUID="63918c32-1f1d-43f2-9243-76c8cb35d556" Apr 17 07:54:05.446695 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:05.446639 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlgcs" event={"ID":"27196395-61d5-4866-b7d2-ebf227547861","Type":"ContainerStarted","Data":"9f6cd51dcbcecc3d5e42ef1fe2e8170090cf61288a7ea6e45a47afbad300c729"} Apr 17 07:54:05.448083 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:54:05.448045 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" event={"ID":"fd3831e0-1641-4a43-be1b-520e1a334313","Type":"ContainerStarted","Data":"776e38abd0b64f6bc7df5f44762c68add474c79ac8abb2494d9c3f9abde1eb2a"} Apr 17 07:54:05.450918 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:05.450546 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" event={"ID":"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6","Type":"ContainerStarted","Data":"6056563c926bfc7ba04029939e13e77dedb6a0ea425d0977a064f98bc41233ee"} Apr 17 07:54:05.450918 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:05.450650 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z5xmb" Apr 17 07:54:05.508466 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:05.508409 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-qlgcs" podStartSLOduration=1.884844578 podStartE2EDuration="4.508394107s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:01.775960774 +0000 UTC m=+154.264841904" lastFinishedPulling="2026-04-17 07:54:04.399510291 +0000 UTC m=+156.888391433" observedRunningTime="2026-04-17 07:54:05.499879417 +0000 UTC m=+157.988760581" watchObservedRunningTime="2026-04-17 07:54:05.508394107 +0000 UTC m=+157.997275256" Apr 17 07:54:05.544096 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:05.544048 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" podStartSLOduration=1.932664841 podStartE2EDuration="4.544025139s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 
07:54:01.792355167 +0000 UTC m=+154.281236297" lastFinishedPulling="2026-04-17 07:54:04.403715463 +0000 UTC m=+156.892596595" observedRunningTime="2026-04-17 07:54:05.543416635 +0000 UTC m=+158.032297788" watchObservedRunningTime="2026-04-17 07:54:05.544025139 +0000 UTC m=+158.032906290" Apr 17 07:54:05.589482 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:05.589421 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" podStartSLOduration=1.9479096870000001 podStartE2EDuration="4.589405065s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:01.764905326 +0000 UTC m=+154.253786456" lastFinishedPulling="2026-04-17 07:54:04.406400701 +0000 UTC m=+156.895281834" observedRunningTime="2026-04-17 07:54:05.588326392 +0000 UTC m=+158.077207568" watchObservedRunningTime="2026-04-17 07:54:05.589405065 +0000 UTC m=+158.078286217" Apr 17 07:54:07.439687 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:07.439608 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wcz5n_f2e063f6-7071-4323-a26c-9d5f28ce786e/dns-node-resolver/0.log" Apr 17 07:54:08.441162 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:08.441137 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mtwj6_b2e0178e-22bf-4ec0-8752-a62c91d1d7a5/node-ca/0.log" Apr 17 07:54:08.903632 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:08.903592 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:08.903815 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:08.903748 2565 
secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 07:54:08.903815 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:08.903808 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls podName:94325974-a593-47a2-94a2-4383b50e341c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:16.903789769 +0000 UTC m=+169.392670903 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-skwnb" (UID: "94325974-a593-47a2-94a2-4383b50e341c") : secret "samples-operator-tls" not found Apr 17 07:54:09.004197 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:09.004161 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" Apr 17 07:54:09.004376 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:09.004235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:09.004376 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.004355 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle 
podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.004337813 +0000 UTC m=+169.493218944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : configmap references non-existent config key: service-ca.crt Apr 17 07:54:09.004452 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.004426 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 07:54:09.004452 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:09.004439 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s" Apr 17 07:54:09.004508 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.004479 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert podName:b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.004465086 +0000 UTC m=+169.493346216 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfd6g" (UID: "b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3") : secret "networking-console-plugin-cert" not found Apr 17 07:54:09.004549 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.004513 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:54:09.004581 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.004554 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:17.004543696 +0000 UTC m=+169.493424826 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : secret "router-metrics-certs-default" not found Apr 17 07:54:09.307355 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:09.307314 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb" Apr 17 07:54:09.307535 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.307466 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:54:09.307535 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.307528 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls podName:7b550782-b0e2-4efb-9013-806a1ec8d616 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:11.307512994 +0000 UTC m=+283.796394125 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls") pod "dns-default-z5xmb" (UID: "7b550782-b0e2-4efb-9013-806a1ec8d616") : secret "dns-default-metrics-tls" not found Apr 17 07:54:09.408168 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:09.408131 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn" Apr 17 07:54:09.408359 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.408273 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:54:09.408359 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:09.408325 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert podName:66caf165-b357-465a-87dc-24e5229f236e nodeName:}" failed. No retries permitted until 2026-04-17 07:56:11.408309888 +0000 UTC m=+283.897191018 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert") pod "ingress-canary-62zbn" (UID: "66caf165-b357-465a-87dc-24e5229f236e") : secret "canary-serving-cert" not found Apr 17 07:54:16.970353 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:16.970312 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:16.972657 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:16.972636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/94325974-a593-47a2-94a2-4383b50e341c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-skwnb\" (UID: \"94325974-a593-47a2-94a2-4383b50e341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" Apr 17 07:54:17.018260 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.018186 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"
Apr 17 07:54:17.071257 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.071203 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:17.071407 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.071338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"
Apr 17 07:54:17.071407 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.071393 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:17.071513 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:17.071491 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 07:54:17.071562 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:17.071542 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle podName:d999a14b-e053-4d0b-8b72-526fefe663ca nodeName:}" failed. No retries permitted until 2026-04-17 07:54:33.071523446 +0000 UTC m=+185.560404580 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle") pod "router-default-779d7cd6d-kbl8s" (UID: "d999a14b-e053-4d0b-8b72-526fefe663ca") : configmap references non-existent config key: service-ca.crt
Apr 17 07:54:17.071623 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:17.071567 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert podName:b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:33.071557767 +0000 UTC m=+185.560438898 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfd6g" (UID: "b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3") : secret "networking-console-plugin-cert" not found
Apr 17 07:54:17.080296 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.080231 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d999a14b-e053-4d0b-8b72-526fefe663ca-metrics-certs\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:17.145877 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.145837 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb"]
Apr 17 07:54:17.482167 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:17.482134 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" event={"ID":"94325974-a593-47a2-94a2-4383b50e341c","Type":"ContainerStarted","Data":"ef9462e5b77eea20d8d481bd0a2e29bca94bd0af221f9794c464f9610017b7ad"}
Apr 17 07:54:18.041155 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:18.041124 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:54:19.039294 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:19.039263 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq"
Apr 17 07:54:19.488448 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:19.488417 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" event={"ID":"94325974-a593-47a2-94a2-4383b50e341c","Type":"ContainerStarted","Data":"431c5e57bfb30960041abef9342d6d8d2b045b2a1800acb782b4c7c98c302f67"}
Apr 17 07:54:19.488448 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:19.488452 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" event={"ID":"94325974-a593-47a2-94a2-4383b50e341c","Type":"ContainerStarted","Data":"3aaaccf3c0e5769f407bf2aa46cdfab899cdfee29f54475feb8171d5f7deb9ee"}
Apr 17 07:54:19.504964 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:19.504906 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-skwnb" podStartSLOduration=17.192859245 podStartE2EDuration="18.50489312s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:17.186795585 +0000 UTC m=+169.675676714" lastFinishedPulling="2026-04-17 07:54:18.498829454 +0000 UTC m=+170.987710589" observedRunningTime="2026-04-17 07:54:19.503655944 +0000 UTC m=+171.992537106" watchObservedRunningTime="2026-04-17 07:54:19.50489312 +0000 UTC m=+171.993774271"
Apr 17 07:54:30.797804 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.797771 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"]
Apr 17 07:54:30.800824 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.800808 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.803047 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.803014 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6b9tl\""
Apr 17 07:54:30.803895 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.803876 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 07:54:30.803962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.803880 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 07:54:30.803962 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.803934 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 07:54:30.809156 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.809138 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 07:54:30.817290 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.817256 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"]
Apr 17 07:54:30.861552 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.861516 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v8457"]
Apr 17 07:54:30.864837 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.864814 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:30.866982 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.866962 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-m2z7h\""
Apr 17 07:54:30.867158 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.867141 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 07:54:30.867269 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.867251 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 07:54:30.874993 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.874965 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v8457"]
Apr 17 07:54:30.876506 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crk9d\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-kube-api-access-crk9d\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876612 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-image-registry-private-configuration\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876612 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-trusted-ca\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876612 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876600 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-installation-pull-secrets\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876748 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876647 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-ca-trust-extracted\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876748 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876705 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-registry-tls\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876852 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-registry-certificates\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.876852 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.876821 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-bound-sa-token\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.977891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.977839 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crk9d\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-kube-api-access-crk9d\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.977891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.977900 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-image-registry-private-configuration\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978141 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.977984 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-trusted-ca\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978141 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978019 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-installation-pull-secrets\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978141 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978043 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-ca-trust-extracted\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978141 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978078 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-registry-tls\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978141 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978110 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dz6j\" (UniqueName: \"kubernetes.io/projected/5f74b0a4-4315-46f8-a669-be2646461e18-kube-api-access-9dz6j\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:30.978426 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978186 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5f74b0a4-4315-46f8-a669-be2646461e18-crio-socket\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:30.978426 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978240 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5f74b0a4-4315-46f8-a669-be2646461e18-data-volume\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:30.978426 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978343 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-registry-certificates\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978426 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978375 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5f74b0a4-4315-46f8-a669-be2646461e18-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:30.978426 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978419 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-bound-sa-token\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.978651 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978446 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5f74b0a4-4315-46f8-a669-be2646461e18-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:30.978651 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.978587 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-ca-trust-extracted\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.979080 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.979053 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-trusted-ca\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.979176 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.979082 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-registry-certificates\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.980667 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.980639 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-image-registry-private-configuration\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.980882 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.980864 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-installation-pull-secrets\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.981029 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.981011 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-registry-tls\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.986101 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.986077 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crk9d\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-kube-api-access-crk9d\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:30.986345 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:30.986329 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77062eb3-1bc4-4732-a8e0-c5a9253bbac5-bound-sa-token\") pod \"image-registry-7fbcd8fb7b-hvmtr\" (UID: \"77062eb3-1bc4-4732-a8e0-c5a9253bbac5\") " pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:31.079686 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.079602 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dz6j\" (UniqueName: \"kubernetes.io/projected/5f74b0a4-4315-46f8-a669-be2646461e18-kube-api-access-9dz6j\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.079686 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.079670 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5f74b0a4-4315-46f8-a669-be2646461e18-crio-socket\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.079887 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.079698 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5f74b0a4-4315-46f8-a669-be2646461e18-data-volume\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.079887 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.079744 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5f74b0a4-4315-46f8-a669-be2646461e18-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.079887 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.079787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5f74b0a4-4315-46f8-a669-be2646461e18-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.079887 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.079801 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5f74b0a4-4315-46f8-a669-be2646461e18-crio-socket\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.080078 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.080059 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5f74b0a4-4315-46f8-a669-be2646461e18-data-volume\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.080345 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.080323 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5f74b0a4-4315-46f8-a669-be2646461e18-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.082065 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.082035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5f74b0a4-4315-46f8-a669-be2646461e18-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.087595 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.087573 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dz6j\" (UniqueName: \"kubernetes.io/projected/5f74b0a4-4315-46f8-a669-be2646461e18-kube-api-access-9dz6j\") pod \"insights-runtime-extractor-v8457\" (UID: \"5f74b0a4-4315-46f8-a669-be2646461e18\") " pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.110089 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.110060 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:31.173665 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.173575 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v8457"
Apr 17 07:54:31.236202 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.236170 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"]
Apr 17 07:54:31.241130 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:31.241104 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77062eb3_1bc4_4732_a8e0_c5a9253bbac5.slice/crio-a8044af99d84eccbf10641acbea522c7abb11fa0acf38d35d3fac187e304c826 WatchSource:0}: Error finding container a8044af99d84eccbf10641acbea522c7abb11fa0acf38d35d3fac187e304c826: Status 404 returned error can't find the container with id a8044af99d84eccbf10641acbea522c7abb11fa0acf38d35d3fac187e304c826
Apr 17 07:54:31.299240 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.299193 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v8457"]
Apr 17 07:54:31.302581 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:31.302550 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f74b0a4_4315_46f8_a669_be2646461e18.slice/crio-3946e014eaf464c4c638789fd72cfc619bbf88fb87c5e2ad65f4711c41774147 WatchSource:0}: Error finding container 3946e014eaf464c4c638789fd72cfc619bbf88fb87c5e2ad65f4711c41774147: Status 404 returned error can't find the container with id 3946e014eaf464c4c638789fd72cfc619bbf88fb87c5e2ad65f4711c41774147
Apr 17 07:54:31.523307 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.523254 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr" event={"ID":"77062eb3-1bc4-4732-a8e0-c5a9253bbac5","Type":"ContainerStarted","Data":"d1fbb190a2129017374541b4d8f56e81c8aeda46eff2955c4704658c5a09b25b"}
Apr 17 07:54:31.523307 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.523299 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr" event={"ID":"77062eb3-1bc4-4732-a8e0-c5a9253bbac5","Type":"ContainerStarted","Data":"a8044af99d84eccbf10641acbea522c7abb11fa0acf38d35d3fac187e304c826"}
Apr 17 07:54:31.523505 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.523393 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:31.524621 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.524598 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8457" event={"ID":"5f74b0a4-4315-46f8-a669-be2646461e18","Type":"ContainerStarted","Data":"b6c660791074c3f5463cb8469c100905ae8670bc0cb1afd61818ccc010797ba2"}
Apr 17 07:54:31.524744 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.524624 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8457" event={"ID":"5f74b0a4-4315-46f8-a669-be2646461e18","Type":"ContainerStarted","Data":"3946e014eaf464c4c638789fd72cfc619bbf88fb87c5e2ad65f4711c41774147"}
Apr 17 07:54:31.541355 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:31.541309 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr" podStartSLOduration=1.541293434 podStartE2EDuration="1.541293434s" podCreationTimestamp="2026-04-17 07:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:31.540440098 +0000 UTC m=+184.029321249" watchObservedRunningTime="2026-04-17 07:54:31.541293434 +0000 UTC m=+184.030174586"
Apr 17 07:54:32.533404 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:32.533363 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8457" event={"ID":"5f74b0a4-4315-46f8-a669-be2646461e18","Type":"ContainerStarted","Data":"8ab67a8cb53e25a1121c0cf361818514be40fc392b917d84b673f5a9ca2774fd"}
Apr 17 07:54:33.098610 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.098561 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"
Apr 17 07:54:33.098801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.098725 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:33.099417 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.099380 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d999a14b-e053-4d0b-8b72-526fefe663ca-service-ca-bundle\") pod \"router-default-779d7cd6d-kbl8s\" (UID: \"d999a14b-e053-4d0b-8b72-526fefe663ca\") " pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:33.101240 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.101202 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfd6g\" (UID: \"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"
Apr 17 07:54:33.322895 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.322871 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z94wh\""
Apr 17 07:54:33.330534 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.330515 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"
Apr 17 07:54:33.330700 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.330682 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-zqr6b\""
Apr 17 07:54:33.339692 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.339653 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:33.470797 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.470765 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g"]
Apr 17 07:54:33.474816 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:33.474788 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2cc8f03_c0c9_465b_b46c_f4c6b89b56e3.slice/crio-325507d421593dcd67a5b49d3d7dd15876896fbf24be53be06755e1e278fdf3c WatchSource:0}: Error finding container 325507d421593dcd67a5b49d3d7dd15876896fbf24be53be06755e1e278fdf3c: Status 404 returned error can't find the container with id 325507d421593dcd67a5b49d3d7dd15876896fbf24be53be06755e1e278fdf3c
Apr 17 07:54:33.492808 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.492778 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-779d7cd6d-kbl8s"]
Apr 17 07:54:33.495649 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:33.495624 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd999a14b_e053_4d0b_8b72_526fefe663ca.slice/crio-110bfe117157de38b2cf82e87ba26597ea94154ee874a9c6fbb72028930f4725 WatchSource:0}: Error finding container 110bfe117157de38b2cf82e87ba26597ea94154ee874a9c6fbb72028930f4725: Status 404 returned error can't find the container with id 110bfe117157de38b2cf82e87ba26597ea94154ee874a9c6fbb72028930f4725
Apr 17 07:54:33.538597 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.538560 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v8457" event={"ID":"5f74b0a4-4315-46f8-a669-be2646461e18","Type":"ContainerStarted","Data":"b43597dc3c1492d400fd5d3403599253c3b844304b91f2d0ef128922a8ee9069"}
Apr 17 07:54:33.539661 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.539639 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-779d7cd6d-kbl8s" event={"ID":"d999a14b-e053-4d0b-8b72-526fefe663ca","Type":"ContainerStarted","Data":"110bfe117157de38b2cf82e87ba26597ea94154ee874a9c6fbb72028930f4725"}
Apr 17 07:54:33.540512 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.540492 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" event={"ID":"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3","Type":"ContainerStarted","Data":"325507d421593dcd67a5b49d3d7dd15876896fbf24be53be06755e1e278fdf3c"}
Apr 17 07:54:33.558141 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:33.558095 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v8457" podStartSLOduration=1.641323555 podStartE2EDuration="3.558080676s" podCreationTimestamp="2026-04-17 07:54:30 +0000 UTC" firstStartedPulling="2026-04-17 07:54:31.358283 +0000 UTC m=+183.847164141" lastFinishedPulling="2026-04-17 07:54:33.275040118 +0000 UTC m=+185.763921262" observedRunningTime="2026-04-17 07:54:33.557030087 +0000 UTC m=+186.045911239" watchObservedRunningTime="2026-04-17 07:54:33.558080676 +0000 UTC m=+186.046961828"
Apr 17 07:54:34.544531 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:34.544491 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" event={"ID":"b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3","Type":"ContainerStarted","Data":"40ef1b5b0b783bbaf34bad70922d8b320840336b54c93e2bc63a70692c33a397"}
Apr 17 07:54:34.545874 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:34.545844 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-779d7cd6d-kbl8s" event={"ID":"d999a14b-e053-4d0b-8b72-526fefe663ca","Type":"ContainerStarted","Data":"b23dfb0e3fccac622486c771077f1c1e4673bb7cdce246a72cccc0f76289cf94"}
Apr 17 07:54:34.563169 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:34.563081 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfd6g" podStartSLOduration=32.726443703 podStartE2EDuration="33.56306282s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:33.476807174 +0000 UTC m=+185.965688305" lastFinishedPulling="2026-04-17 07:54:34.313426289 +0000 UTC m=+186.802307422" observedRunningTime="2026-04-17 07:54:34.562534626 +0000 UTC m=+187.051415779" watchObservedRunningTime="2026-04-17 07:54:34.56306282 +0000 UTC m=+187.051943973"
Apr 17 07:54:34.587507 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:34.587454 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-779d7cd6d-kbl8s" podStartSLOduration=33.587437054 podStartE2EDuration="33.587437054s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:54:34.586580768 +0000 UTC m=+187.075461923" watchObservedRunningTime="2026-04-17 07:54:34.587437054 +0000 UTC m=+187.076318206"
Apr 17 07:54:35.340156 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.340115 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:35.342924 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.342896 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:35.469826 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.469789 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc"]
Apr 17 07:54:35.472945 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.472929 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc"
Apr 17 07:54:35.475544 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.475525 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-79wwf\""
Apr 17 07:54:35.475654 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.475553 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 07:54:35.482748 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.482727 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc"]
Apr 17 07:54:35.524343 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.524296 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eaa1d24a-6baa-4384-96d4-2aa7a6b5734a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pf7jc\" (UID: \"eaa1d24a-6baa-4384-96d4-2aa7a6b5734a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc"
Apr 17 07:54:35.549862 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.549829 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:35.551073 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.551051 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-779d7cd6d-kbl8s"
Apr 17 07:54:35.625792 ip-10-0-133-228 kubenswrapper[2565]:
I0417 07:54:35.625683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eaa1d24a-6baa-4384-96d4-2aa7a6b5734a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pf7jc\" (UID: \"eaa1d24a-6baa-4384-96d4-2aa7a6b5734a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" Apr 17 07:54:35.628161 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.628140 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eaa1d24a-6baa-4384-96d4-2aa7a6b5734a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pf7jc\" (UID: \"eaa1d24a-6baa-4384-96d4-2aa7a6b5734a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" Apr 17 07:54:35.781805 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.781760 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" Apr 17 07:54:35.903574 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:35.903494 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc"] Apr 17 07:54:35.906879 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:35.906848 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa1d24a_6baa_4384_96d4_2aa7a6b5734a.slice/crio-b524b75fe31d045948b33b39d78eb052ef6460c7172cd161ae510ecbdef7b01d WatchSource:0}: Error finding container b524b75fe31d045948b33b39d78eb052ef6460c7172cd161ae510ecbdef7b01d: Status 404 returned error can't find the container with id b524b75fe31d045948b33b39d78eb052ef6460c7172cd161ae510ecbdef7b01d Apr 17 07:54:36.553642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:36.553598 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" event={"ID":"eaa1d24a-6baa-4384-96d4-2aa7a6b5734a","Type":"ContainerStarted","Data":"b524b75fe31d045948b33b39d78eb052ef6460c7172cd161ae510ecbdef7b01d"} Apr 17 07:54:37.557296 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:37.557254 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" event={"ID":"eaa1d24a-6baa-4384-96d4-2aa7a6b5734a","Type":"ContainerStarted","Data":"492b727f447017a537f64d50032dd759eedbc02c35cd4d9cd8f40c274a043342"} Apr 17 07:54:37.557744 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:37.557538 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" Apr 17 07:54:37.562176 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:37.562152 2565 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" Apr 17 07:54:37.574008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:37.573961 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pf7jc" podStartSLOduration=1.609547353 podStartE2EDuration="2.573948943s" podCreationTimestamp="2026-04-17 07:54:35 +0000 UTC" firstStartedPulling="2026-04-17 07:54:35.908851433 +0000 UTC m=+188.397732564" lastFinishedPulling="2026-04-17 07:54:36.873253021 +0000 UTC m=+189.362134154" observedRunningTime="2026-04-17 07:54:37.573320946 +0000 UTC m=+190.062202100" watchObservedRunningTime="2026-04-17 07:54:37.573948943 +0000 UTC m=+190.062830094" Apr 17 07:54:41.919777 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.919742 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9czgc"] Apr 17 07:54:41.924905 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.924878 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:41.927820 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.927784 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 07:54:41.927956 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.927785 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 07:54:41.927956 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.927873 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-lbkvx\"" Apr 17 07:54:41.928829 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.928809 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 07:54:41.928829 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.928825 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 07:54:41.928976 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.928867 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 07:54:41.928976 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.928877 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 07:54:41.935138 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.935118 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9czgc"] Apr 17 07:54:41.947711 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.947688 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-gdn8b"] Apr 17 07:54:41.951100 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.951072 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:41.952993 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.952975 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 07:54:41.953134 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.953115 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 07:54:41.953224 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.953197 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 07:54:41.953295 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:41.953227 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-68tcw\"" Apr 17 07:54:42.080525 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080491 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.080525 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080530 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.080791 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080550 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vphsm\" (UniqueName: \"kubernetes.io/projected/cc81e894-4be4-42e1-8d62-b69a7b840a45-kube-api-access-vphsm\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.080791 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080568 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-accelerators-collector-config\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.080791 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080658 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/02065558-0ec1-4075-aac4-36ffc7ebb493-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.080791 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080702 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-root\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " 
pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.080791 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080760 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02065558-0ec1-4075-aac4-36ffc7ebb493-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.080791 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080789 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-tls\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080818 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-textfile\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080843 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-wtmp\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080877 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080903 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-sys\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.080978 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4tx\" (UniqueName: \"kubernetes.io/projected/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-api-access-gq4tx\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.081001 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc81e894-4be4-42e1-8d62-b69a7b840a45-metrics-client-ca\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.081057 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.081015 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " 
pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182281 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.182281 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182234 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.182281 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vphsm\" (UniqueName: \"kubernetes.io/projected/cc81e894-4be4-42e1-8d62-b69a7b840a45-kube-api-access-vphsm\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182281 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182277 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-accelerators-collector-config\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182306 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/02065558-0ec1-4075-aac4-36ffc7ebb493-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182325 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-root\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182360 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02065558-0ec1-4075-aac4-36ffc7ebb493-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182431 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-root\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182459 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-tls\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182577 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:54:42.182493 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-textfile\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182519 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-wtmp\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182553 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.182577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182578 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-sys\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4tx\" (UniqueName: \"kubernetes.io/projected/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-api-access-gq4tx\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182665 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc81e894-4be4-42e1-8d62-b69a7b840a45-metrics-client-ca\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182726 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-sys\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182809 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/02065558-0ec1-4075-aac4-36ffc7ebb493-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182859 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-wtmp\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182871 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-textfile\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183008 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.182960 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-accelerators-collector-config\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.183337 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.183064 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.183397 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.183340 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02065558-0ec1-4075-aac4-36ffc7ebb493-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" 
Apr 17 07:54:42.183472 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.183447 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc81e894-4be4-42e1-8d62-b69a7b840a45-metrics-client-ca\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.185010 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.184986 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.185137 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.185049 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.185504 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.185486 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cc81e894-4be4-42e1-8d62-b69a7b840a45-node-exporter-tls\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.185571 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.185493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.193043 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.193018 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vphsm\" (UniqueName: \"kubernetes.io/projected/cc81e894-4be4-42e1-8d62-b69a7b840a45-kube-api-access-vphsm\") pod \"node-exporter-gdn8b\" (UID: \"cc81e894-4be4-42e1-8d62-b69a7b840a45\") " pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.193258 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.193198 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4tx\" (UniqueName: \"kubernetes.io/projected/02065558-0ec1-4075-aac4-36ffc7ebb493-kube-api-access-gq4tx\") pod \"kube-state-metrics-69db897b98-9czgc\" (UID: \"02065558-0ec1-4075-aac4-36ffc7ebb493\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.234614 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.234576 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" Apr 17 07:54:42.260456 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.260420 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gdn8b" Apr 17 07:54:42.268781 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:42.268749 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc81e894_4be4_42e1_8d62_b69a7b840a45.slice/crio-4bad49707c226a34803c00f2db0927ba99c8541357fe54d7497693f2c9b9c2c3 WatchSource:0}: Error finding container 4bad49707c226a34803c00f2db0927ba99c8541357fe54d7497693f2c9b9c2c3: Status 404 returned error can't find the container with id 4bad49707c226a34803c00f2db0927ba99c8541357fe54d7497693f2c9b9c2c3 Apr 17 07:54:42.357597 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.357563 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9czgc"] Apr 17 07:54:42.360736 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:42.360704 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02065558_0ec1_4075_aac4_36ffc7ebb493.slice/crio-459a619f83dce2efae1f73c88ab8252fdeae3cf055aa0122e40cd19c295855db WatchSource:0}: Error finding container 459a619f83dce2efae1f73c88ab8252fdeae3cf055aa0122e40cd19c295855db: Status 404 returned error can't find the container with id 459a619f83dce2efae1f73c88ab8252fdeae3cf055aa0122e40cd19c295855db Apr 17 07:54:42.571224 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.571172 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gdn8b" event={"ID":"cc81e894-4be4-42e1-8d62-b69a7b840a45","Type":"ContainerStarted","Data":"4bad49707c226a34803c00f2db0927ba99c8541357fe54d7497693f2c9b9c2c3"} Apr 17 07:54:42.572273 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:42.572245 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" 
event={"ID":"02065558-0ec1-4075-aac4-36ffc7ebb493","Type":"ContainerStarted","Data":"459a619f83dce2efae1f73c88ab8252fdeae3cf055aa0122e40cd19c295855db"} Apr 17 07:54:43.097735 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.097704 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:54:43.102140 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.102113 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.106657 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cssh6\"" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.106657 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.106915 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.106657 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.107066 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.107096 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 07:54:43.107259 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.107236 2565 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 07:54:43.107706 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.107363 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 07:54:43.109466 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.109445 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 07:54:43.110121 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.109766 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 07:54:43.134191 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.134162 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:54:43.193250 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193168 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-config-volume\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193250 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193229 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8466\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-kube-api-access-r8466\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193483 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193294 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193483 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193319 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193483 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193352 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-web-config\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193483 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193403 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193483 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193458 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-config-out\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
07:54:43.193719 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193485 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193719 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193555 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193719 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193582 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193719 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193608 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193719 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.193719 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.193674 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295009 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.294976 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295181 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295269 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295223 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295327 ip-10-0-133-228 kubenswrapper[2565]: I0417 
07:54:43.295274 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295327 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295536 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-config-volume\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295555 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8466\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-kube-api-access-r8466\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295664 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295607 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295664 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295642 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295766 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295676 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-web-config\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295766 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295750 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295861 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295840 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-config-out\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.295914 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.295866 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-main-tls\") pod 
\"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.296102 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.296080 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.296612 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.296404 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.296734 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.296653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.298348 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.298320 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.299531 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.299433 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-config-out\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.299531 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.299523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.299867 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.299820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.300145 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.300106 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-config-volume\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.300145 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.300122 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-web-config\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.300312 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.300194 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.300312 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.300257 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.300423 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.300401 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.304762 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.304733 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8466\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-kube-api-access-r8466\") pod \"alertmanager-main-0\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.414374 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.414294 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:54:43.576324 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.576291 2565 generic.go:358] "Generic (PLEG): container finished" podID="cc81e894-4be4-42e1-8d62-b69a7b840a45" containerID="62c0b16671930fd54cb0dc753b052a659a9a62fd619108300997c9df9874e20c" exitCode=0 Apr 17 07:54:43.576453 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.576382 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gdn8b" event={"ID":"cc81e894-4be4-42e1-8d62-b69a7b840a45","Type":"ContainerDied","Data":"62c0b16671930fd54cb0dc753b052a659a9a62fd619108300997c9df9874e20c"} Apr 17 07:54:43.577990 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.577966 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" event={"ID":"02065558-0ec1-4075-aac4-36ffc7ebb493","Type":"ContainerStarted","Data":"4cfdcdf710bbbac33745bb2940275ff8803432619962399385d9cbac5c671227"} Apr 17 07:54:43.588395 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:43.588373 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:54:43.590974 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:43.590929 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7937961b_49f5_4615_8e12_580c400b92be.slice/crio-e27b0c04784223ebff79379aad1065b7cdc3809da5ac85c661b8a9fdf528d36a WatchSource:0}: Error finding container e27b0c04784223ebff79379aad1065b7cdc3809da5ac85c661b8a9fdf528d36a: Status 404 returned error can't find the container with id e27b0c04784223ebff79379aad1065b7cdc3809da5ac85c661b8a9fdf528d36a Apr 17 07:54:44.582613 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.582567 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gdn8b" 
event={"ID":"cc81e894-4be4-42e1-8d62-b69a7b840a45","Type":"ContainerStarted","Data":"54bb80d44858da6effa8a6dc0de828c672ce616b3cfb633306af736c3efc5974"} Apr 17 07:54:44.582613 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.582613 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gdn8b" event={"ID":"cc81e894-4be4-42e1-8d62-b69a7b840a45","Type":"ContainerStarted","Data":"0978032fb2161eab3eca2eac9f767d4f39b93c54ebee297a073f49cce01662c6"} Apr 17 07:54:44.583642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.583621 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"e27b0c04784223ebff79379aad1065b7cdc3809da5ac85c661b8a9fdf528d36a"} Apr 17 07:54:44.585289 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.585266 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" event={"ID":"02065558-0ec1-4075-aac4-36ffc7ebb493","Type":"ContainerStarted","Data":"f92613524f2b5eee80ad971526c028c2da0c090afb9e6d4065b490bdb61277a9"} Apr 17 07:54:44.585371 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.585294 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" event={"ID":"02065558-0ec1-4075-aac4-36ffc7ebb493","Type":"ContainerStarted","Data":"51575081e137c5ff21639a03d79a744eea5adfd4b69061da7d4c1292299631fd"} Apr 17 07:54:44.606845 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.606797 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gdn8b" podStartSLOduration=2.781801667 podStartE2EDuration="3.606784609s" podCreationTimestamp="2026-04-17 07:54:41 +0000 UTC" firstStartedPulling="2026-04-17 07:54:42.270421859 +0000 UTC m=+194.759302989" lastFinishedPulling="2026-04-17 07:54:43.095404792 +0000 UTC 
m=+195.584285931" observedRunningTime="2026-04-17 07:54:44.605710615 +0000 UTC m=+197.094591767" watchObservedRunningTime="2026-04-17 07:54:44.606784609 +0000 UTC m=+197.095665760" Apr 17 07:54:44.624851 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:44.624794 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9czgc" podStartSLOduration=2.491532205 podStartE2EDuration="3.624778628s" podCreationTimestamp="2026-04-17 07:54:41 +0000 UTC" firstStartedPulling="2026-04-17 07:54:42.362586766 +0000 UTC m=+194.851467899" lastFinishedPulling="2026-04-17 07:54:43.495833176 +0000 UTC m=+195.984714322" observedRunningTime="2026-04-17 07:54:44.623558657 +0000 UTC m=+197.112439811" watchObservedRunningTime="2026-04-17 07:54:44.624778628 +0000 UTC m=+197.113659780" Apr 17 07:54:45.590357 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:45.590319 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c" exitCode=0 Apr 17 07:54:45.590810 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:45.590404 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c"} Apr 17 07:54:46.326373 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.326334 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7b94f87fff-dbzqr"] Apr 17 07:54:46.330093 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.330066 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:54:46.332934 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.332861 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 07:54:46.333469 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.333184 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 07:54:46.333469 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.333291 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 07:54:46.333947 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.333923 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-5ivagb1qp862t\"" Apr 17 07:54:46.334113 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.334095 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 07:54:46.334250 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.334197 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-6zqr6\"" Apr 17 07:54:46.342886 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.342862 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b94f87fff-dbzqr"] Apr 17 07:54:46.425116 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425078 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-secret-metrics-server-tls\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " 
pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:54:46.425325 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425130 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-client-ca-bundle\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:54:46.425325 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425187 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c78173-0f99-48c3-abbb-50b961b43510-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:54:46.425325 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425230 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/60c78173-0f99-48c3-abbb-50b961b43510-audit-log\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:54:46.425325 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425267 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-secret-metrics-server-client-certs\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:54:46.425503 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425375 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cm79\" (UniqueName: \"kubernetes.io/projected/60c78173-0f99-48c3-abbb-50b961b43510-kube-api-access-4cm79\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.425503 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.425432 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/60c78173-0f99-48c3-abbb-50b961b43510-metrics-server-audit-profiles\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.526776 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526739 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cm79\" (UniqueName: \"kubernetes.io/projected/60c78173-0f99-48c3-abbb-50b961b43510-kube-api-access-4cm79\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.526931 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526798 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/60c78173-0f99-48c3-abbb-50b961b43510-metrics-server-audit-profiles\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.526931 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526850 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-secret-metrics-server-tls\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.526931 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-client-ca-bundle\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.526931 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526893 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c78173-0f99-48c3-abbb-50b961b43510-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.527152 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526933 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/60c78173-0f99-48c3-abbb-50b961b43510-audit-log\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.527152 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.526968 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-secret-metrics-server-client-certs\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.527681 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.527581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/60c78173-0f99-48c3-abbb-50b961b43510-audit-log\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.527896 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.527850 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60c78173-0f99-48c3-abbb-50b961b43510-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.528024 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.527900 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/60c78173-0f99-48c3-abbb-50b961b43510-metrics-server-audit-profiles\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.529677 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.529650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-secret-metrics-server-tls\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.529800 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.529766 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-secret-metrics-server-client-certs\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.529800 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.529791 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c78173-0f99-48c3-abbb-50b961b43510-client-ca-bundle\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.535227 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.535187 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cm79\" (UniqueName: \"kubernetes.io/projected/60c78173-0f99-48c3-abbb-50b961b43510-kube-api-access-4cm79\") pod \"metrics-server-7b94f87fff-dbzqr\" (UID: \"60c78173-0f99-48c3-abbb-50b961b43510\") " pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.642616 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.642501 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr"
Apr 17 07:54:46.667104 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.667059 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"]
Apr 17 07:54:46.682689 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.682639 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"]
Apr 17 07:54:46.682874 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.682847 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:46.685398 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.685366 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-zsfv8\""
Apr 17 07:54:46.685866 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.685841 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 07:54:46.830160 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.830120 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4c4gp\" (UID: \"1697098b-61c4-4dbd-b7d3-6338c7f5e46c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:46.931788 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.931240 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4c4gp\" (UID: \"1697098b-61c4-4dbd-b7d3-6338c7f5e46c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:46.931788 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:46.931458 2565 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 17 07:54:46.931788 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:46.931523 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert podName:1697098b-61c4-4dbd-b7d3-6338c7f5e46c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:47.431502642 +0000 UTC m=+199.920383792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-4c4gp" (UID: "1697098b-61c4-4dbd-b7d3-6338c7f5e46c") : secret "monitoring-plugin-cert" not found
Apr 17 07:54:46.991904 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:46.991859 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b94f87fff-dbzqr"]
Apr 17 07:54:46.996196 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:46.996165 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c78173_0f99_48c3_abbb_50b961b43510.slice/crio-0470d1822e7233786d46882de769c4e97dc92972a9fd62c57d0abc6cddefc114 WatchSource:0}: Error finding container 0470d1822e7233786d46882de769c4e97dc92972a9fd62c57d0abc6cddefc114: Status 404 returned error can't find the container with id 0470d1822e7233786d46882de769c4e97dc92972a9fd62c57d0abc6cddefc114
Apr 17 07:54:47.436810 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.436715 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4c4gp\" (UID: \"1697098b-61c4-4dbd-b7d3-6338c7f5e46c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:47.436979 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:47.436889 2565 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 17 07:54:47.436979 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:54:47.436974 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert podName:1697098b-61c4-4dbd-b7d3-6338c7f5e46c nodeName:}" failed. No retries permitted until 2026-04-17 07:54:48.436952182 +0000 UTC m=+200.925833327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-4c4gp" (UID: "1697098b-61c4-4dbd-b7d3-6338c7f5e46c") : secret "monitoring-plugin-cert" not found
Apr 17 07:54:47.599414 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.599371 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" event={"ID":"60c78173-0f99-48c3-abbb-50b961b43510","Type":"ContainerStarted","Data":"0470d1822e7233786d46882de769c4e97dc92972a9fd62c57d0abc6cddefc114"}
Apr 17 07:54:47.603434 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.603399 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125"}
Apr 17 07:54:47.603600 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.603441 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785"}
Apr 17 07:54:47.603600 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.603454 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95"}
Apr 17 07:54:47.603600 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.603466 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62"}
Apr 17 07:54:47.603600 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:47.603478 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b"}
Apr 17 07:54:48.446973 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:48.446931 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4c4gp\" (UID: \"1697098b-61c4-4dbd-b7d3-6338c7f5e46c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:48.449881 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:48.449854 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1697098b-61c4-4dbd-b7d3-6338c7f5e46c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4c4gp\" (UID: \"1697098b-61c4-4dbd-b7d3-6338c7f5e46c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:48.495782 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:48.495744 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:48.610320 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:48.610276 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerStarted","Data":"f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6"}
Apr 17 07:54:48.652372 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:48.652320 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.357197155 podStartE2EDuration="5.652304097s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:43.596329375 +0000 UTC m=+196.085210508" lastFinishedPulling="2026-04-17 07:54:47.891436314 +0000 UTC m=+200.380317450" observedRunningTime="2026-04-17 07:54:48.649582724 +0000 UTC m=+201.138463913" watchObservedRunningTime="2026-04-17 07:54:48.652304097 +0000 UTC m=+201.141185248"
Apr 17 07:54:48.825387 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:48.825357 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"]
Apr 17 07:54:48.828599 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:48.828572 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1697098b_61c4_4dbd_b7d3_6338c7f5e46c.slice/crio-64dac212e5d707f35ea7d5e9c875378dd271e757dd7dbc5c8c8849c97d9aa024 WatchSource:0}: Error finding container 64dac212e5d707f35ea7d5e9c875378dd271e757dd7dbc5c8c8849c97d9aa024: Status 404 returned error can't find the container with id 64dac212e5d707f35ea7d5e9c875378dd271e757dd7dbc5c8c8849c97d9aa024
Apr 17 07:54:49.614711 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:49.614671 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" event={"ID":"60c78173-0f99-48c3-abbb-50b961b43510","Type":"ContainerStarted","Data":"6a4a57b98ac4fce4496f8be283080f4043215a2b34203c4c54d092a8e558316f"}
Apr 17 07:54:49.615764 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:49.615734 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp" event={"ID":"1697098b-61c4-4dbd-b7d3-6338c7f5e46c","Type":"ContainerStarted","Data":"64dac212e5d707f35ea7d5e9c875378dd271e757dd7dbc5c8c8849c97d9aa024"}
Apr 17 07:54:49.635264 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:49.635200 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" podStartSLOduration=1.886725754 podStartE2EDuration="3.635182998s" podCreationTimestamp="2026-04-17 07:54:46 +0000 UTC" firstStartedPulling="2026-04-17 07:54:46.998155036 +0000 UTC m=+199.487036166" lastFinishedPulling="2026-04-17 07:54:48.746612271 +0000 UTC m=+201.235493410" observedRunningTime="2026-04-17 07:54:49.63389094 +0000 UTC m=+202.122772092" watchObservedRunningTime="2026-04-17 07:54:49.635182998 +0000 UTC m=+202.124064150"
Apr 17 07:54:50.619564 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:50.619521 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp" event={"ID":"1697098b-61c4-4dbd-b7d3-6338c7f5e46c","Type":"ContainerStarted","Data":"8f4e0bcc20fa69102f7b7efba94bbffbce59004ad7eb8a6e8fb3b0b936b28114"}
Apr 17 07:54:50.639642 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:50.639588 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp" podStartSLOduration=3.349080386 podStartE2EDuration="4.639570489s" podCreationTimestamp="2026-04-17 07:54:46 +0000 UTC" firstStartedPulling="2026-04-17 07:54:48.830556503 +0000 UTC m=+201.319437634" lastFinishedPulling="2026-04-17 07:54:50.121046592 +0000 UTC m=+202.609927737" observedRunningTime="2026-04-17 07:54:50.637693 +0000 UTC m=+203.126574164" watchObservedRunningTime="2026-04-17 07:54:50.639570489 +0000 UTC m=+203.128451640"
Apr 17 07:54:51.623487 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:51.623452 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:51.628352 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:51.628324 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4c4gp"
Apr 17 07:54:52.537812 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:52.537786 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fbcd8fb7b-hvmtr"
Apr 17 07:54:53.091161 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.091125 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9fc5dbbb-rrd2c"]
Apr 17 07:54:53.094618 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.094594 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.097760 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.097738 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 07:54:53.098348 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098323 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 07:54:53.098626 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098606 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2dvsl\""
Apr 17 07:54:53.098626 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098614 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 07:54:53.098801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098660 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 07:54:53.098801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098616 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 07:54:53.098801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098615 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 07:54:53.098801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.098606 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 07:54:53.111028 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.110999 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9fc5dbbb-rrd2c"]
Apr 17 07:54:53.191055 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.191015 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nkc6\" (UniqueName: \"kubernetes.io/projected/3d7d8091-a563-4616-b5a9-cd5aa8633a43-kube-api-access-4nkc6\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.191055 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.191065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-config\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.191328 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.191164 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-service-ca\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.191328 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.191200 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-oauth-serving-cert\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.191328 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.191268 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-serving-cert\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.191328 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.191298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-oauth-config\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.292577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.292535 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nkc6\" (UniqueName: \"kubernetes.io/projected/3d7d8091-a563-4616-b5a9-cd5aa8633a43-kube-api-access-4nkc6\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.292577 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.292585 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-config\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.292816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.292631 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-service-ca\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.292816 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.292654 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-oauth-serving-cert\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.292916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.292823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-serving-cert\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.292916 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.292896 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-oauth-config\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.293409 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.293390 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-service-ca\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.293481 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.293420 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-config\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.293481 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.293465 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-oauth-serving-cert\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.295323 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.295304 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-serving-cert\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.295441 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.295421 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-oauth-config\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.308315 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.308280 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nkc6\" (UniqueName: \"kubernetes.io/projected/3d7d8091-a563-4616-b5a9-cd5aa8633a43-kube-api-access-4nkc6\") pod \"console-9fc5dbbb-rrd2c\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.403975 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.403877 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fc5dbbb-rrd2c"
Apr 17 07:54:53.529626 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.529592 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9fc5dbbb-rrd2c"]
Apr 17 07:54:53.533503 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:54:53.533479 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d7d8091_a563_4616_b5a9_cd5aa8633a43.slice/crio-d51785a0ad4267a28f0dad58362cee7f93c9bd873d975a906f2d8e698f008ba2 WatchSource:0}: Error finding container d51785a0ad4267a28f0dad58362cee7f93c9bd873d975a906f2d8e698f008ba2: Status 404 returned error can't find the container with id d51785a0ad4267a28f0dad58362cee7f93c9bd873d975a906f2d8e698f008ba2
Apr 17 07:54:53.630072 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:53.630033 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fc5dbbb-rrd2c" event={"ID":"3d7d8091-a563-4616-b5a9-cd5aa8633a43","Type":"ContainerStarted","Data":"d51785a0ad4267a28f0dad58362cee7f93c9bd873d975a906f2d8e698f008ba2"}
Apr 17 07:54:56.640368 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:56.640292 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fc5dbbb-rrd2c" event={"ID":"3d7d8091-a563-4616-b5a9-cd5aa8633a43","Type":"ContainerStarted","Data":"6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce"}
Apr 17 07:54:56.656683 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:54:56.656635 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9fc5dbbb-rrd2c" podStartSLOduration=0.865061525 podStartE2EDuration="3.656619725s" podCreationTimestamp="2026-04-17 07:54:53 +0000 UTC" firstStartedPulling="2026-04-17 07:54:53.535373978 +0000 UTC m=+206.024255108" lastFinishedPulling="2026-04-17 07:54:56.326932169 +0000 UTC m=+208.815813308" observedRunningTime="2026-04-17 07:54:56.655400054 +0000 UTC m=+209.144281217" watchObservedRunningTime="2026-04-17 07:54:56.656619725 +0000 UTC m=+209.145500877"
Apr 17 07:55:02.574905 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.574868 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b945859d6-csrg2"]
Apr 17 07:55:02.578312 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.578292 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.586613 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.586588 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 07:55:02.587924 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.587899 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b945859d6-csrg2"]
Apr 17 07:55:02.670303 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670265 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxr4\" (UniqueName: \"kubernetes.io/projected/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-kube-api-access-wsxr4\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.670465 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670372 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-config\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.670465 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670413 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-serving-cert\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.670465 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670442 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-oauth-config\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.670465 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670458 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-trusted-ca-bundle\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.670666 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670497 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-oauth-serving-cert\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.670666 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.670570 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-service-ca\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771310 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771273 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-service-ca\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771508 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771329 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxr4\" (UniqueName: \"kubernetes.io/projected/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-kube-api-access-wsxr4\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771508 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771378 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-config\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771508 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771405 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-serving-cert\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771508 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-oauth-config\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771508 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771443 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-trusted-ca-bundle\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.771508 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.771470 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-oauth-serving-cert\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.772150 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.772121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-service-ca\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.772359 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.772135 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-config\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.772359 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.772162 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-oauth-serving-cert\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.772964 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.772946 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-trusted-ca-bundle\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.773843 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.773821 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-serving-cert\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.773925 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.773899 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-oauth-config\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.779828 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.779804 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxr4\" (UniqueName: \"kubernetes.io/projected/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-kube-api-access-wsxr4\") pod \"console-7b945859d6-csrg2\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") " pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:55:02.888572 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:02.888462 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b945859d6-csrg2" Apr 17 07:55:03.009716 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.009611 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b945859d6-csrg2"] Apr 17 07:55:03.012361 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:55:03.012331 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05f0765_d745_43ea_9fb2_2e481b4cd1c3.slice/crio-96e18f6a05ef6cee5717d3a9108663dbbb1017789c32fa2d6aad81a4d7216939 WatchSource:0}: Error finding container 96e18f6a05ef6cee5717d3a9108663dbbb1017789c32fa2d6aad81a4d7216939: Status 404 returned error can't find the container with id 96e18f6a05ef6cee5717d3a9108663dbbb1017789c32fa2d6aad81a4d7216939 Apr 17 07:55:03.404762 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.404702 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9fc5dbbb-rrd2c" Apr 17 07:55:03.404762 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.404770 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9fc5dbbb-rrd2c" Apr 17 07:55:03.409418 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.409391 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9fc5dbbb-rrd2c" Apr 17 07:55:03.662870 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.662779 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b945859d6-csrg2" event={"ID":"f05f0765-d745-43ea-9fb2-2e481b4cd1c3","Type":"ContainerStarted","Data":"389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14"} Apr 17 07:55:03.662870 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.662824 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b945859d6-csrg2" 
event={"ID":"f05f0765-d745-43ea-9fb2-2e481b4cd1c3","Type":"ContainerStarted","Data":"96e18f6a05ef6cee5717d3a9108663dbbb1017789c32fa2d6aad81a4d7216939"} Apr 17 07:55:03.666610 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.666581 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9fc5dbbb-rrd2c" Apr 17 07:55:03.685800 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:03.685749 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b945859d6-csrg2" podStartSLOduration=1.685735097 podStartE2EDuration="1.685735097s" podCreationTimestamp="2026-04-17 07:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:55:03.684194442 +0000 UTC m=+216.173075594" watchObservedRunningTime="2026-04-17 07:55:03.685735097 +0000 UTC m=+216.174616249" Apr 17 07:55:06.643911 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:06.643872 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:55:06.644333 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:06.643946 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:55:10.685568 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:10.685468 2565 generic.go:358] "Generic (PLEG): container finished" podID="fd3831e0-1641-4a43-be1b-520e1a334313" containerID="776e38abd0b64f6bc7df5f44762c68add474c79ac8abb2494d9c3f9abde1eb2a" exitCode=0 Apr 17 07:55:10.685568 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:10.685539 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" 
event={"ID":"fd3831e0-1641-4a43-be1b-520e1a334313","Type":"ContainerDied","Data":"776e38abd0b64f6bc7df5f44762c68add474c79ac8abb2494d9c3f9abde1eb2a"} Apr 17 07:55:10.685964 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:10.685901 2565 scope.go:117] "RemoveContainer" containerID="776e38abd0b64f6bc7df5f44762c68add474c79ac8abb2494d9c3f9abde1eb2a" Apr 17 07:55:11.690775 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:11.690738 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-hsdfm" event={"ID":"fd3831e0-1641-4a43-be1b-520e1a334313","Type":"ContainerStarted","Data":"a72b780f6f8656a800571f172a3811d7f64aa21be74c8c2a78098f8615c186e1"} Apr 17 07:55:12.889346 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:12.889309 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b945859d6-csrg2" Apr 17 07:55:12.889824 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:12.889486 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b945859d6-csrg2" Apr 17 07:55:12.894062 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:12.894041 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b945859d6-csrg2" Apr 17 07:55:13.699770 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:13.699744 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b945859d6-csrg2" Apr 17 07:55:13.743882 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:13.743846 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9fc5dbbb-rrd2c"] Apr 17 07:55:25.730858 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:25.730821 2565 generic.go:358] "Generic (PLEG): container finished" podID="27196395-61d5-4866-b7d2-ebf227547861" containerID="9f6cd51dcbcecc3d5e42ef1fe2e8170090cf61288a7ea6e45a47afbad300c729" exitCode=0 Apr 
17 07:55:25.731271 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:25.730895 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlgcs" event={"ID":"27196395-61d5-4866-b7d2-ebf227547861","Type":"ContainerDied","Data":"9f6cd51dcbcecc3d5e42ef1fe2e8170090cf61288a7ea6e45a47afbad300c729"} Apr 17 07:55:25.731271 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:25.731248 2565 scope.go:117] "RemoveContainer" containerID="9f6cd51dcbcecc3d5e42ef1fe2e8170090cf61288a7ea6e45a47afbad300c729" Apr 17 07:55:26.648275 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:26.648242 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:55:26.656636 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:26.656604 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b94f87fff-dbzqr" Apr 17 07:55:26.734981 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:26.734950 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlgcs" event={"ID":"27196395-61d5-4866-b7d2-ebf227547861","Type":"ContainerStarted","Data":"ab7d93801cfe5297ceaaf0a13cd09053276a0f041d5db2d87910eb2586d37f13"} Apr 17 07:55:26.987255 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:26.987199 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wcz5n_f2e063f6-7071-4323-a26c-9d5f28ce786e/dns-node-resolver/0.log" Apr 17 07:55:30.747122 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:30.747087 2565 generic.go:358] "Generic (PLEG): container finished" podID="cdcde5b0-5ef1-4fd5-b0b7-de55988110a6" containerID="6056563c926bfc7ba04029939e13e77dedb6a0ea425d0977a064f98bc41233ee" exitCode=0 Apr 17 07:55:30.747608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:30.747168 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" event={"ID":"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6","Type":"ContainerDied","Data":"6056563c926bfc7ba04029939e13e77dedb6a0ea425d0977a064f98bc41233ee"} Apr 17 07:55:30.747694 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:30.747629 2565 scope.go:117] "RemoveContainer" containerID="6056563c926bfc7ba04029939e13e77dedb6a0ea425d0977a064f98bc41233ee" Apr 17 07:55:31.751060 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:31.751023 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qfdj6" event={"ID":"cdcde5b0-5ef1-4fd5-b0b7-de55988110a6","Type":"ContainerStarted","Data":"974112836a93f65183faf5e342f46a9ebd8de810cc00dbc7485cd725002381da"} Apr 17 07:55:38.769090 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:38.768979 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9fc5dbbb-rrd2c" podUID="3d7d8091-a563-4616-b5a9-cd5aa8633a43" containerName="console" containerID="cri-o://6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce" gracePeriod=15 Apr 17 07:55:39.030261 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.030167 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9fc5dbbb-rrd2c_3d7d8091-a563-4616-b5a9-cd5aa8633a43/console/0.log" Apr 17 07:55:39.030396 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.030268 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9fc5dbbb-rrd2c" Apr 17 07:55:39.214124 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214072 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-oauth-config\") pod \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " Apr 17 07:55:39.214350 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214158 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-service-ca\") pod \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " Apr 17 07:55:39.214350 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214203 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-serving-cert\") pod \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " Apr 17 07:55:39.214350 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214272 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-config\") pod \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " Apr 17 07:55:39.214350 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214314 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nkc6\" (UniqueName: \"kubernetes.io/projected/3d7d8091-a563-4616-b5a9-cd5aa8633a43-kube-api-access-4nkc6\") pod \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " Apr 17 07:55:39.214576 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:55:39.214518 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-oauth-serving-cert\") pod \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\" (UID: \"3d7d8091-a563-4616-b5a9-cd5aa8633a43\") " Apr 17 07:55:39.214576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214528 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-service-ca" (OuterVolumeSpecName: "service-ca") pod "3d7d8091-a563-4616-b5a9-cd5aa8633a43" (UID: "3d7d8091-a563-4616-b5a9-cd5aa8633a43"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:39.214690 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214620 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-config" (OuterVolumeSpecName: "console-config") pod "3d7d8091-a563-4616-b5a9-cd5aa8633a43" (UID: "3d7d8091-a563-4616-b5a9-cd5aa8633a43"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:39.214906 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214856 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:55:39.214906 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214883 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-service-ca\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:55:39.215071 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.214904 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3d7d8091-a563-4616-b5a9-cd5aa8633a43" (UID: "3d7d8091-a563-4616-b5a9-cd5aa8633a43"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:55:39.216666 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.216641 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3d7d8091-a563-4616-b5a9-cd5aa8633a43" (UID: "3d7d8091-a563-4616-b5a9-cd5aa8633a43"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:55:39.216908 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.216885 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3d7d8091-a563-4616-b5a9-cd5aa8633a43" (UID: "3d7d8091-a563-4616-b5a9-cd5aa8633a43"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:55:39.216908 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.216895 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7d8091-a563-4616-b5a9-cd5aa8633a43-kube-api-access-4nkc6" (OuterVolumeSpecName: "kube-api-access-4nkc6") pod "3d7d8091-a563-4616-b5a9-cd5aa8633a43" (UID: "3d7d8091-a563-4616-b5a9-cd5aa8633a43"). InnerVolumeSpecName "kube-api-access-4nkc6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:55:39.315459 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.315373 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-serving-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:55:39.315459 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.315406 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4nkc6\" (UniqueName: \"kubernetes.io/projected/3d7d8091-a563-4616-b5a9-cd5aa8633a43-kube-api-access-4nkc6\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:55:39.315459 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.315416 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d7d8091-a563-4616-b5a9-cd5aa8633a43-oauth-serving-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:55:39.315459 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.315425 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d7d8091-a563-4616-b5a9-cd5aa8633a43-console-oauth-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:55:39.777335 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.777306 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-9fc5dbbb-rrd2c_3d7d8091-a563-4616-b5a9-cd5aa8633a43/console/0.log" Apr 17 07:55:39.777740 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.777348 2565 generic.go:358] "Generic (PLEG): container finished" podID="3d7d8091-a563-4616-b5a9-cd5aa8633a43" containerID="6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce" exitCode=2 Apr 17 07:55:39.777740 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.777414 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fc5dbbb-rrd2c" Apr 17 07:55:39.777740 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.777436 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fc5dbbb-rrd2c" event={"ID":"3d7d8091-a563-4616-b5a9-cd5aa8633a43","Type":"ContainerDied","Data":"6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce"} Apr 17 07:55:39.777740 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.777473 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fc5dbbb-rrd2c" event={"ID":"3d7d8091-a563-4616-b5a9-cd5aa8633a43","Type":"ContainerDied","Data":"d51785a0ad4267a28f0dad58362cee7f93c9bd873d975a906f2d8e698f008ba2"} Apr 17 07:55:39.777740 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.777490 2565 scope.go:117] "RemoveContainer" containerID="6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce" Apr 17 07:55:39.787331 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.787313 2565 scope.go:117] "RemoveContainer" containerID="6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce" Apr 17 07:55:39.787674 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:55:39.787654 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce\": container with ID starting with 
6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce not found: ID does not exist" containerID="6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce" Apr 17 07:55:39.787739 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.787681 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce"} err="failed to get container status \"6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce\": rpc error: code = NotFound desc = could not find container \"6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce\": container with ID starting with 6c9dc4475a9fe4204fce28fede34b83d2d197b15d7fb4562c41f6133d8ca60ce not found: ID does not exist" Apr 17 07:55:39.797697 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.797672 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9fc5dbbb-rrd2c"] Apr 17 07:55:39.800862 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.800834 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9fc5dbbb-rrd2c"] Apr 17 07:55:39.820716 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.820675 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:55:39.823017 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:39.822991 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63918c32-1f1d-43f2-9243-76c8cb35d556-metrics-certs\") pod \"network-metrics-daemon-k6mnq\" (UID: \"63918c32-1f1d-43f2-9243-76c8cb35d556\") " pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:55:40.043056 
ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:40.042787 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqz2q\"" Apr 17 07:55:40.044644 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:40.044247 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7d8091-a563-4616-b5a9-cd5aa8633a43" path="/var/lib/kubelet/pods/3d7d8091-a563-4616-b5a9-cd5aa8633a43/volumes" Apr 17 07:55:40.050813 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:40.050754 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6mnq" Apr 17 07:55:40.201981 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:40.201894 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k6mnq"] Apr 17 07:55:40.205000 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:55:40.204972 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63918c32_1f1d_43f2_9243_76c8cb35d556.slice/crio-1f23c7fc9740a9a52d967f66fc206c2392dc504e8106323ce694873f822fe679 WatchSource:0}: Error finding container 1f23c7fc9740a9a52d967f66fc206c2392dc504e8106323ce694873f822fe679: Status 404 returned error can't find the container with id 1f23c7fc9740a9a52d967f66fc206c2392dc504e8106323ce694873f822fe679 Apr 17 07:55:40.784786 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:40.784747 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6mnq" event={"ID":"63918c32-1f1d-43f2-9243-76c8cb35d556","Type":"ContainerStarted","Data":"1f23c7fc9740a9a52d967f66fc206c2392dc504e8106323ce694873f822fe679"} Apr 17 07:55:41.789605 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:41.789566 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6mnq" 
event={"ID":"63918c32-1f1d-43f2-9243-76c8cb35d556","Type":"ContainerStarted","Data":"d910bb6102adf7c7fc443a273d0c0a74abf94b5b6a3a511c25ac31c57df881c3"} Apr 17 07:55:41.789605 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:41.789608 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6mnq" event={"ID":"63918c32-1f1d-43f2-9243-76c8cb35d556","Type":"ContainerStarted","Data":"a822de7fbe9f6fb239074e0dc2edb39744370e0ce1ed2392acb550c7b3d3e5ed"} Apr 17 07:55:41.804579 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:55:41.804525 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k6mnq" podStartSLOduration=252.782372036 podStartE2EDuration="4m13.804511513s" podCreationTimestamp="2026-04-17 07:51:28 +0000 UTC" firstStartedPulling="2026-04-17 07:55:40.207364624 +0000 UTC m=+252.696245757" lastFinishedPulling="2026-04-17 07:55:41.229504093 +0000 UTC m=+253.718385234" observedRunningTime="2026-04-17 07:55:41.802972958 +0000 UTC m=+254.291854132" watchObservedRunningTime="2026-04-17 07:55:41.804511513 +0000 UTC m=+254.293392666" Apr 17 07:56:02.371584 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.371545 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:02.372051 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.371984 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="alertmanager" containerID="cri-o://e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b" gracePeriod=120 Apr 17 07:56:02.372192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.372042 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7937961b-49f5-4615-8e12-580c400b92be" 
containerName="kube-rbac-proxy-metric" containerID="cri-o://e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125" gracePeriod=120 Apr 17 07:56:02.372192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.372082 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-web" containerID="cri-o://c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95" gracePeriod=120 Apr 17 07:56:02.372192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.372114 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="prom-label-proxy" containerID="cri-o://f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6" gracePeriod=120 Apr 17 07:56:02.372192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.372100 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="config-reloader" containerID="cri-o://63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62" gracePeriod=120 Apr 17 07:56:02.372192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.372100 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy" containerID="cri-o://b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785" gracePeriod=120 Apr 17 07:56:02.851591 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851552 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6" exitCode=0 Apr 17 07:56:02.851591 
ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851582 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125" exitCode=0 Apr 17 07:56:02.851591 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851590 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785" exitCode=0 Apr 17 07:56:02.851591 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851597 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62" exitCode=0 Apr 17 07:56:02.851591 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851603 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b" exitCode=0 Apr 17 07:56:02.851880 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851621 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6"} Apr 17 07:56:02.851880 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851654 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125"} Apr 17 07:56:02.851880 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851664 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785"} Apr 17 07:56:02.851880 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851673 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62"} Apr 17 07:56:02.851880 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.851684 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b"} Apr 17 07:56:02.939747 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.939708 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fc4f64cbc-h4mbc"] Apr 17 07:56:02.940078 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.940063 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d7d8091-a563-4616-b5a9-cd5aa8633a43" containerName="console" Apr 17 07:56:02.940119 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.940080 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d8091-a563-4616-b5a9-cd5aa8633a43" containerName="console" Apr 17 07:56:02.940155 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.940140 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d7d8091-a563-4616-b5a9-cd5aa8633a43" containerName="console" Apr 17 07:56:02.943126 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.943108 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:02.953871 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:02.953839 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc4f64cbc-h4mbc"] Apr 17 07:56:03.033290 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033255 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-service-ca\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.033290 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033291 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kb7\" (UniqueName: \"kubernetes.io/projected/d0bc4a87-193c-4cb1-b00c-6731b399161d-kube-api-access-j6kb7\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.033519 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033321 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-config\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.033519 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033407 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-oauth-serving-cert\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 
07:56:03.033519 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033485 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-trusted-ca-bundle\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.033624 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033524 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-oauth-config\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.033624 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.033542 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-serving-cert\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134415 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134303 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-oauth-config\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134415 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134354 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-serving-cert\") 
pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134415 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134407 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-service-ca\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134674 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kb7\" (UniqueName: \"kubernetes.io/projected/d0bc4a87-193c-4cb1-b00c-6731b399161d-kube-api-access-j6kb7\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134674 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134472 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-config\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134674 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-oauth-serving-cert\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.134674 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.134585 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-trusted-ca-bundle\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.135276 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.135245 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-oauth-serving-cert\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.135276 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.135256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-config\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.135456 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.135245 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-service-ca\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.135456 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.135419 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-trusted-ca-bundle\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.136924 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.136892 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-serving-cert\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.137059 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.137041 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-oauth-config\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.142543 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.142521 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kb7\" (UniqueName: \"kubernetes.io/projected/d0bc4a87-193c-4cb1-b00c-6731b399161d-kube-api-access-j6kb7\") pod \"console-5fc4f64cbc-h4mbc\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.252394 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.252348 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 07:56:03.373920 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.373888 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc4f64cbc-h4mbc"] Apr 17 07:56:03.377464 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:56:03.377436 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0bc4a87_193c_4cb1_b00c_6731b399161d.slice/crio-6d47eb4cb74af23300c4d9a745a00fbba4d805858e2c1e2f36b5353ae2c5b19a WatchSource:0}: Error finding container 6d47eb4cb74af23300c4d9a745a00fbba4d805858e2c1e2f36b5353ae2c5b19a: Status 404 returned error can't find the container with id 6d47eb4cb74af23300c4d9a745a00fbba4d805858e2c1e2f36b5353ae2c5b19a Apr 17 07:56:03.616199 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.616176 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:03.741080 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741050 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-config-volume\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741268 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741084 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-metrics-client-ca\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741268 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741105 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-main-db\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741268 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741141 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741406 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741331 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8466\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-kube-api-access-r8466\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741406 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741369 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-trusted-ca-bundle\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741496 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741440 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741496 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741480 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" 
(UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-cluster-tls-config\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741592 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741513 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:03.741592 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741534 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-web-config\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741592 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741521 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:03.741592 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741578 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-config-out\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741784 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741615 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-web\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741784 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741665 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-main-tls\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.741784 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741688 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-tls-assets\") pod \"7937961b-49f5-4615-8e12-580c400b92be\" (UID: \"7937961b-49f5-4615-8e12-580c400b92be\") " Apr 17 07:56:03.742271 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.741981 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-metrics-client-ca\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.742271 ip-10-0-133-228 kubenswrapper[2565]: I0417 
07:56:03.742005 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-main-db\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.742271 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.742124 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:56:03.744181 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744145 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-config-volume" (OuterVolumeSpecName: "config-volume") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.744394 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744194 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.744458 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744390 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.744656 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744547 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-config-out" (OuterVolumeSpecName: "config-out") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 07:56:03.744656 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744625 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-kube-api-access-r8466" (OuterVolumeSpecName: "kube-api-access-r8466") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "kube-api-access-r8466". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:03.744787 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744767 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.744840 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.744786 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:56:03.745658 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.745639 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.748536 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.748512 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.754808 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.754788 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-web-config" (OuterVolumeSpecName: "web-config") pod "7937961b-49f5-4615-8e12-580c400b92be" (UID: "7937961b-49f5-4615-8e12-580c400b92be"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 07:56:03.842712 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842684 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842712 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842709 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8466\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-kube-api-access-r8466\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842712 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842719 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7937961b-49f5-4615-8e12-580c400b92be-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842729 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842739 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-cluster-tls-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842749 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-web-config\") 
on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842757 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7937961b-49f5-4615-8e12-580c400b92be-config-out\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842774 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842783 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-secret-alertmanager-main-tls\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842792 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7937961b-49f5-4615-8e12-580c400b92be-tls-assets\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.842891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.842800 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7937961b-49f5-4615-8e12-580c400b92be-config-volume\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 07:56:03.856042 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.856015 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc4f64cbc-h4mbc" 
event={"ID":"d0bc4a87-193c-4cb1-b00c-6731b399161d","Type":"ContainerStarted","Data":"3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc"} Apr 17 07:56:03.856143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.856052 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc4f64cbc-h4mbc" event={"ID":"d0bc4a87-193c-4cb1-b00c-6731b399161d","Type":"ContainerStarted","Data":"6d47eb4cb74af23300c4d9a745a00fbba4d805858e2c1e2f36b5353ae2c5b19a"} Apr 17 07:56:03.858614 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.858594 2565 generic.go:358] "Generic (PLEG): container finished" podID="7937961b-49f5-4615-8e12-580c400b92be" containerID="c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95" exitCode=0 Apr 17 07:56:03.858678 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.858630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95"} Apr 17 07:56:03.858678 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.858649 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7937961b-49f5-4615-8e12-580c400b92be","Type":"ContainerDied","Data":"e27b0c04784223ebff79379aad1065b7cdc3809da5ac85c661b8a9fdf528d36a"} Apr 17 07:56:03.858678 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.858665 2565 scope.go:117] "RemoveContainer" containerID="f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6" Apr 17 07:56:03.858801 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.858681 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:03.865708 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.865678 2565 scope.go:117] "RemoveContainer" containerID="e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125" Apr 17 07:56:03.872220 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.872191 2565 scope.go:117] "RemoveContainer" containerID="b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785" Apr 17 07:56:03.873157 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.873112 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fc4f64cbc-h4mbc" podStartSLOduration=1.8730967120000002 podStartE2EDuration="1.873096712s" podCreationTimestamp="2026-04-17 07:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:03.872144496 +0000 UTC m=+276.361025648" watchObservedRunningTime="2026-04-17 07:56:03.873096712 +0000 UTC m=+276.361977869" Apr 17 07:56:03.883718 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.883687 2565 scope.go:117] "RemoveContainer" containerID="c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95" Apr 17 07:56:03.886455 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.886430 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:03.889795 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.889772 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:03.892163 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.892148 2565 scope.go:117] "RemoveContainer" containerID="63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62" Apr 17 07:56:03.898714 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.898686 2565 scope.go:117] "RemoveContainer" 
containerID="e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b" Apr 17 07:56:03.905322 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.905300 2565 scope.go:117] "RemoveContainer" containerID="1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c" Apr 17 07:56:03.911759 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.911660 2565 scope.go:117] "RemoveContainer" containerID="f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6" Apr 17 07:56:03.912736 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.912711 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6\": container with ID starting with f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6 not found: ID does not exist" containerID="f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6" Apr 17 07:56:03.912817 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.912750 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6"} err="failed to get container status \"f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6\": rpc error: code = NotFound desc = could not find container \"f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6\": container with ID starting with f3bb61204be5372027ba0fd6fdcad0cc6b2da241acdf74256b7ab68be96b19f6 not found: ID does not exist" Apr 17 07:56:03.912817 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.912775 2565 scope.go:117] "RemoveContainer" containerID="e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125" Apr 17 07:56:03.913070 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.913052 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125\": container with ID starting with e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125 not found: ID does not exist" containerID="e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125" Apr 17 07:56:03.913192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913079 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125"} err="failed to get container status \"e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125\": rpc error: code = NotFound desc = could not find container \"e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125\": container with ID starting with e76a4ce3fd78cd28825dfdf507b97bab7a0f939df97b120af46b89c4cef70125 not found: ID does not exist" Apr 17 07:56:03.913192 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913103 2565 scope.go:117] "RemoveContainer" containerID="b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785" Apr 17 07:56:03.913410 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913395 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:03.913461 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.913399 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785\": container with ID starting with b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785 not found: ID does not exist" containerID="b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785" Apr 17 07:56:03.913461 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913431 2565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785"} err="failed to get container status \"b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785\": rpc error: code = NotFound desc = could not find container \"b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785\": container with ID starting with b92c81c8bbbbf8ed6aed09f8a746b6cd00cdddb2f5a966f5229604b4f8c92785 not found: ID does not exist" Apr 17 07:56:03.913461 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913454 2565 scope.go:117] "RemoveContainer" containerID="c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95" Apr 17 07:56:03.913750 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.913728 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95\": container with ID starting with c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95 not found: ID does not exist" containerID="c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95" Apr 17 07:56:03.913750 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913745 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="init-config-reloader" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913757 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="init-config-reloader" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913769 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="config-reloader" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913759 2565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95"} err="failed to get container status \"c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95\": rpc error: code = NotFound desc = could not find container \"c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95\": container with ID starting with c203975178a4baf9ab9128e50118b0de3c10215884254694be15f98f0aaf3b95 not found: ID does not exist" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913775 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="config-reloader" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913784 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-metric" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913792 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-metric" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913800 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913805 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913815 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="prom-label-proxy" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913820 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="prom-label-proxy" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913829 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="alertmanager" Apr 17 07:56:03.913831 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913835 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="alertmanager" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913844 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-web" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913849 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-web" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913782 2565 scope.go:117] "RemoveContainer" containerID="63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913894 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913903 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="alertmanager" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913911 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-web" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913920 2565 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="prom-label-proxy" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913931 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="kube-rbac-proxy-metric" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.913941 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7937961b-49f5-4615-8e12-580c400b92be" containerName="config-reloader" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.914114 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62\": container with ID starting with 63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62 not found: ID does not exist" containerID="63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.914148 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62"} err="failed to get container status \"63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62\": rpc error: code = NotFound desc = could not find container \"63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62\": container with ID starting with 63711af070dbc5e4df6fe4a3b6a68d6b083c77d28a38a74e23c54e3cead17b62 not found: ID does not exist" Apr 17 07:56:03.914286 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.914172 2565 scope.go:117] "RemoveContainer" containerID="e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b" Apr 17 07:56:03.914655 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.914439 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b\": container with ID starting with e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b not found: ID does not exist" containerID="e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b" Apr 17 07:56:03.914655 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.914460 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b"} err="failed to get container status \"e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b\": rpc error: code = NotFound desc = could not find container \"e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b\": container with ID starting with e11f147d5a3c4c30e86c3fe709ad2e0719675a71a1056de030b856756d579d9b not found: ID does not exist" Apr 17 07:56:03.914655 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.914474 2565 scope.go:117] "RemoveContainer" containerID="1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c" Apr 17 07:56:03.914765 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:03.914709 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c\": container with ID starting with 1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c not found: ID does not exist" containerID="1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c" Apr 17 07:56:03.914765 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.914753 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c"} err="failed to get container status \"1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c\": rpc error: code = NotFound desc = could 
not find container \"1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c\": container with ID starting with 1d97b7c8893fb9518d2cf105f0541771c4d0bd2796ffa3f55d35ac5fa73aaf5c not found: ID does not exist" Apr 17 07:56:03.919068 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.919052 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:03.921382 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921349 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 07:56:03.921570 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921353 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 07:56:03.921570 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921392 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 07:56:03.921570 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921442 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 07:56:03.921570 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921505 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 07:56:03.921970 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921808 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 07:56:03.921970 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.921953 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 07:56:03.922090 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:56:03.922009 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cssh6\"" Apr 17 07:56:03.922090 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.922010 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 07:56:03.926143 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.926059 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 07:56:03.929048 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:03.929028 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 07:56:04.043265 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.043190 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7937961b-49f5-4615-8e12-580c400b92be" path="/var/lib/kubelet/pods/7937961b-49f5-4615-8e12-580c400b92be/volumes" Apr 17 07:56:04.045314 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045292 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045390 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045340 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74ad00b-b762-4945-ac23-5e04425bd6cd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045390 ip-10-0-133-228 
kubenswrapper[2565]: I0417 07:56:04.045364 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045476 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045390 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045476 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045415 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c74ad00b-b762-4945-ac23-5e04425bd6cd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045576 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045541 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8zw\" (UniqueName: \"kubernetes.io/projected/c74ad00b-b762-4945-ac23-5e04425bd6cd-kube-api-access-6k8zw\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045633 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045591 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-config-volume\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045633 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045624 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c74ad00b-b762-4945-ac23-5e04425bd6cd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045724 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ad00b-b762-4945-ac23-5e04425bd6cd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045724 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045683 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045724 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045710 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-web-config\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045871 
ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045753 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74ad00b-b762-4945-ac23-5e04425bd6cd-config-out\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.045871 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.045808 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.146706 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146676 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8zw\" (UniqueName: \"kubernetes.io/projected/c74ad00b-b762-4945-ac23-5e04425bd6cd-kube-api-access-6k8zw\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.146840 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146723 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-config-volume\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.146840 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146748 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c74ad00b-b762-4945-ac23-5e04425bd6cd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.146840 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146776 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ad00b-b762-4945-ac23-5e04425bd6cd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.146840 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.146840 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-web-config\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74ad00b-b762-4945-ac23-5e04425bd6cd-config-out\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146909 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.146997 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74ad00b-b762-4945-ac23-5e04425bd6cd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.147018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.147040 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147088 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.147069 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c74ad00b-b762-4945-ac23-5e04425bd6cd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.147442 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.147424 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c74ad00b-b762-4945-ac23-5e04425bd6cd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.148341 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.148279 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c74ad00b-b762-4945-ac23-5e04425bd6cd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.148603 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.148510 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c74ad00b-b762-4945-ac23-5e04425bd6cd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.149845 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.149822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74ad00b-b762-4945-ac23-5e04425bd6cd-config-out\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.150070 ip-10-0-133-228 kubenswrapper[2565]: 
I0417 07:56:04.150050 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.150137 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.150116 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.150486 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.150468 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74ad00b-b762-4945-ac23-5e04425bd6cd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.150645 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.150627 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 07:56:04.150700 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.150686 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:56:04.150748 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.150723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-config-volume\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:56:04.150945 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.150929 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-web-config\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:56:04.151789 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.151774 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c74ad00b-b762-4945-ac23-5e04425bd6cd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:56:04.154485 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.154467 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8zw\" (UniqueName: \"kubernetes.io/projected/c74ad00b-b762-4945-ac23-5e04425bd6cd-kube-api-access-6k8zw\") pod \"alertmanager-main-0\" (UID: \"c74ad00b-b762-4945-ac23-5e04425bd6cd\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:56:04.229906 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.229864 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 07:56:04.361014 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.360988 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 07:56:04.364063 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:56:04.364039 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc74ad00b_b762_4945_ac23_5e04425bd6cd.slice/crio-8b71974ca54cec633892c37be994f1cfd1b40239e9571dcf810c3a92a88ae4c7 WatchSource:0}: Error finding container 8b71974ca54cec633892c37be994f1cfd1b40239e9571dcf810c3a92a88ae4c7: Status 404 returned error can't find the container with id 8b71974ca54cec633892c37be994f1cfd1b40239e9571dcf810c3a92a88ae4c7
Apr 17 07:56:04.864460 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.864428 2565 generic.go:358] "Generic (PLEG): container finished" podID="c74ad00b-b762-4945-ac23-5e04425bd6cd" containerID="deabd4fd28319bc5e1a249c6fd31965d83679d8c39f8f11e7b806fe1a794311a" exitCode=0
Apr 17 07:56:04.864897 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.864483 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerDied","Data":"deabd4fd28319bc5e1a249c6fd31965d83679d8c39f8f11e7b806fe1a794311a"}
Apr 17 07:56:04.864897 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:04.864527 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"8b71974ca54cec633892c37be994f1cfd1b40239e9571dcf810c3a92a88ae4c7"}
Apr 17 07:56:05.870788 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.870753 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"a8ea513f2e46b42995e89ba223a3e0a37659e48d1bb8daa617af38bcbc5a8457"}
Apr 17 07:56:05.870788 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.870788 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"6b33ca2451c43669294e7a5736b0d5520b04b1f3d4e17a55f38fc6612ac0d084"}
Apr 17 07:56:05.871170 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.870798 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"59d4dd86c9305b6d72ae6341a45d16ff0572afbc31bbe26e3162a90ca2ff51fa"}
Apr 17 07:56:05.871170 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.870810 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"657d75fbddad7f67c65ff98b078883fc4a31291c8c7cc09fcb2ee1de7993c957"}
Apr 17 07:56:05.871170 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.870817 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"f4368e10e85d6319be108da98d1138f2086717d0e9a46a4f4098b56d863a8ecc"}
Apr 17 07:56:05.871170 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.870824 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c74ad00b-b762-4945-ac23-5e04425bd6cd","Type":"ContainerStarted","Data":"2591cff74c37b9b194746a6e9958776a81c53145873460023461c43871d8b749"}
Apr 17 07:56:05.895490 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:05.895436 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.895419532 podStartE2EDuration="2.895419532s" podCreationTimestamp="2026-04-17 07:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:56:05.89432407 +0000 UTC m=+278.383205225" watchObservedRunningTime="2026-04-17 07:56:05.895419532 +0000 UTC m=+278.384300686"
Apr 17 07:56:08.451602 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:08.451558 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-z5xmb" podUID="7b550782-b0e2-4efb-9013-806a1ec8d616"
Apr 17 07:56:08.879262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:08.879204 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:56:11.311520 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.311475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:56:11.313861 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.313837 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b550782-b0e2-4efb-9013-806a1ec8d616-metrics-tls\") pod \"dns-default-z5xmb\" (UID: \"7b550782-b0e2-4efb-9013-806a1ec8d616\") " pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:56:11.412031 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.411994 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:56:11.414423 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.414392 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66caf165-b357-465a-87dc-24e5229f236e-cert\") pod \"ingress-canary-62zbn\" (UID: \"66caf165-b357-465a-87dc-24e5229f236e\") " pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:56:11.443891 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.443859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tgj8b\""
Apr 17 07:56:11.451871 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.451843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-62zbn"
Apr 17 07:56:11.574743 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.574718 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-62zbn"]
Apr 17 07:56:11.577442 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:56:11.577412 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66caf165_b357_465a_87dc_24e5229f236e.slice/crio-a306084ec5d38e3fb2c4b5328a7a44fa9ef5d544ce22250081569a8f54dd342e WatchSource:0}: Error finding container a306084ec5d38e3fb2c4b5328a7a44fa9ef5d544ce22250081569a8f54dd342e: Status 404 returned error can't find the container with id a306084ec5d38e3fb2c4b5328a7a44fa9ef5d544ce22250081569a8f54dd342e
Apr 17 07:56:11.582355 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.582336 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l9k4x\""
Apr 17 07:56:11.591149 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.591119 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:56:11.719207 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.719170 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z5xmb"]
Apr 17 07:56:11.722830 ip-10-0-133-228 kubenswrapper[2565]: W0417 07:56:11.722803 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b550782_b0e2_4efb_9013_806a1ec8d616.slice/crio-96db74bfc4d3fca46c7f733bef04b6b854541d339d67a2f56d50e46d51c990d5 WatchSource:0}: Error finding container 96db74bfc4d3fca46c7f733bef04b6b854541d339d67a2f56d50e46d51c990d5: Status 404 returned error can't find the container with id 96db74bfc4d3fca46c7f733bef04b6b854541d339d67a2f56d50e46d51c990d5
Apr 17 07:56:11.890460 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.890360 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-62zbn" event={"ID":"66caf165-b357-465a-87dc-24e5229f236e","Type":"ContainerStarted","Data":"a306084ec5d38e3fb2c4b5328a7a44fa9ef5d544ce22250081569a8f54dd342e"}
Apr 17 07:56:11.891365 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:11.891331 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z5xmb" event={"ID":"7b550782-b0e2-4efb-9013-806a1ec8d616","Type":"ContainerStarted","Data":"96db74bfc4d3fca46c7f733bef04b6b854541d339d67a2f56d50e46d51c990d5"}
Apr 17 07:56:13.253171 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.253127 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fc4f64cbc-h4mbc"
Apr 17 07:56:13.253616 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.253190 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fc4f64cbc-h4mbc"
Apr 17 07:56:13.258475 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.258453 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fc4f64cbc-h4mbc"
Apr 17 07:56:13.900335 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.900244 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-62zbn" event={"ID":"66caf165-b357-465a-87dc-24e5229f236e","Type":"ContainerStarted","Data":"bc295dc391d6ebea24f116d0668cd783ffc60075632b4f17da6d7528712d33bd"}
Apr 17 07:56:13.901875 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.901851 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z5xmb" event={"ID":"7b550782-b0e2-4efb-9013-806a1ec8d616","Type":"ContainerStarted","Data":"0e7bf90826cda90092ebfbc3cce3e4ad6b3ef23994b3daaab8daa606604ad82c"}
Apr 17 07:56:13.902004 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.901880 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z5xmb" event={"ID":"7b550782-b0e2-4efb-9013-806a1ec8d616","Type":"ContainerStarted","Data":"de0282d7df35d409d0b54dd1b31c4053a1a08ca56bd67fdc716287c84dad7d01"}
Apr 17 07:56:13.902004 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.901944 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:56:13.905901 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.905883 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fc4f64cbc-h4mbc"
Apr 17 07:56:13.915488 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.915449 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-62zbn" podStartSLOduration=250.893742519 podStartE2EDuration="4m12.915436098s" podCreationTimestamp="2026-04-17 07:52:01 +0000 UTC" firstStartedPulling="2026-04-17 07:56:11.57911006 +0000 UTC m=+284.067991194" lastFinishedPulling="2026-04-17 07:56:13.600803627 +0000 UTC m=+286.089684773" observedRunningTime="2026-04-17 07:56:13.914423498 +0000 UTC m=+286.403304663" watchObservedRunningTime="2026-04-17 07:56:13.915436098 +0000 UTC m=+286.404317249"
Apr 17 07:56:13.946493 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.946427 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z5xmb" podStartSLOduration=251.068678359 podStartE2EDuration="4m12.946406546s" podCreationTimestamp="2026-04-17 07:52:01 +0000 UTC" firstStartedPulling="2026-04-17 07:56:11.724737909 +0000 UTC m=+284.213619039" lastFinishedPulling="2026-04-17 07:56:13.602466093 +0000 UTC m=+286.091347226" observedRunningTime="2026-04-17 07:56:13.944113447 +0000 UTC m=+286.432994601" watchObservedRunningTime="2026-04-17 07:56:13.946406546 +0000 UTC m=+286.435287699"
Apr 17 07:56:13.960543 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:13.960504 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b945859d6-csrg2"]
Apr 17 07:56:23.909434 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:23.909401 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z5xmb"
Apr 17 07:56:27.978229 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:27.978188 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:56:27.978229 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:27.978199 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 07:56:38.986676 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:38.986639 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b945859d6-csrg2" podUID="f05f0765-d745-43ea-9fb2-2e481b4cd1c3" containerName="console" containerID="cri-o://389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14" gracePeriod=15
Apr 17 07:56:39.228451 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.228424 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b945859d6-csrg2_f05f0765-d745-43ea-9fb2-2e481b4cd1c3/console/0.log"
Apr 17 07:56:39.228585 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.228487 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:56:39.352260 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352147 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-serving-cert\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352411 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352267 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-trusted-ca-bundle\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352411 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352291 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsxr4\" (UniqueName: \"kubernetes.io/projected/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-kube-api-access-wsxr4\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352411 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352310 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-oauth-config\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352411 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352337 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-oauth-serving-cert\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352411 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352382 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-config\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352650 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352445 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-service-ca\") pod \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\" (UID: \"f05f0765-d745-43ea-9fb2-2e481b4cd1c3\") "
Apr 17 07:56:39.352809 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352780 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:39.352887 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352808 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:39.352945 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352827 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-config" (OuterVolumeSpecName: "console-config") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:39.353036 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.352971 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-service-ca" (OuterVolumeSpecName: "service-ca") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:56:39.354493 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.354474 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:56:39.354845 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.354825 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:56:39.354909 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.354827 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-kube-api-access-wsxr4" (OuterVolumeSpecName: "kube-api-access-wsxr4") pod "f05f0765-d745-43ea-9fb2-2e481b4cd1c3" (UID: "f05f0765-d745-43ea-9fb2-2e481b4cd1c3"). InnerVolumeSpecName "kube-api-access-wsxr4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:56:39.453736 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453698 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-trusted-ca-bundle\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.453736 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453730 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wsxr4\" (UniqueName: \"kubernetes.io/projected/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-kube-api-access-wsxr4\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.453736 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453741 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-oauth-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.453957 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453753 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-oauth-serving-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.453957 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453763 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.453957 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453772 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-service-ca\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.453957 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.453780 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f05f0765-d745-43ea-9fb2-2e481b4cd1c3-console-serving-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 07:56:39.986429 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.986400 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b945859d6-csrg2_f05f0765-d745-43ea-9fb2-2e481b4cd1c3/console/0.log"
Apr 17 07:56:39.986608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.986442 2565 generic.go:358] "Generic (PLEG): container finished" podID="f05f0765-d745-43ea-9fb2-2e481b4cd1c3" containerID="389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14" exitCode=2
Apr 17 07:56:39.986608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.986470 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b945859d6-csrg2" event={"ID":"f05f0765-d745-43ea-9fb2-2e481b4cd1c3","Type":"ContainerDied","Data":"389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14"}
Apr 17 07:56:39.986608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.986492 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b945859d6-csrg2" event={"ID":"f05f0765-d745-43ea-9fb2-2e481b4cd1c3","Type":"ContainerDied","Data":"96e18f6a05ef6cee5717d3a9108663dbbb1017789c32fa2d6aad81a4d7216939"}
Apr 17 07:56:39.986608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.986506 2565 scope.go:117] "RemoveContainer" containerID="389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14"
Apr 17 07:56:39.986608 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.986515 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b945859d6-csrg2"
Apr 17 07:56:39.995016 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.994878 2565 scope.go:117] "RemoveContainer" containerID="389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14"
Apr 17 07:56:39.995196 ip-10-0-133-228 kubenswrapper[2565]: E0417 07:56:39.995178 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14\": container with ID starting with 389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14 not found: ID does not exist" containerID="389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14"
Apr 17 07:56:39.995262 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:39.995226 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14"} err="failed to get container status \"389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14\": rpc error: code = NotFound desc = could not find container \"389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14\": container with ID starting with 389a78b8c9f2d5f181ced308dec6c4155c04e88b089709e54036890580e91d14 not found: ID does not exist"
Apr 17 07:56:40.005316 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:40.005289 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b945859d6-csrg2"]
Apr 17 07:56:40.007938 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:40.007912 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b945859d6-csrg2"]
Apr 17 07:56:40.042847 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:56:40.042809 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05f0765-d745-43ea-9fb2-2e481b4cd1c3" path="/var/lib/kubelet/pods/f05f0765-d745-43ea-9fb2-2e481b4cd1c3/volumes"
Apr 17 07:59:58.422450 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.422415 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cf69fd875-4t7rp"]
Apr 17 07:59:58.422910 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.422741 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f05f0765-d745-43ea-9fb2-2e481b4cd1c3" containerName="console"
Apr 17 07:59:58.422910 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.422752 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05f0765-d745-43ea-9fb2-2e481b4cd1c3" containerName="console"
Apr 17 07:59:58.422910 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.422802 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="f05f0765-d745-43ea-9fb2-2e481b4cd1c3" containerName="console"
Apr 17 07:59:58.425610 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.425588 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.445515 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.445480 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf69fd875-4t7rp"]
Apr 17 07:59:58.524744 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.524707 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-oauth-serving-cert\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.524744 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.524753 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-console-config\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.524967 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.524809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b3278f-7ef0-483b-8da5-89d782a41536-console-oauth-config\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.524967 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.524871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b3278f-7ef0-483b-8da5-89d782a41536-console-serving-cert\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.524967 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.524887 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-trusted-ca-bundle\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.524967 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.524906 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-service-ca\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.525091 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.525003 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbxvv\" (UniqueName: \"kubernetes.io/projected/a9b3278f-7ef0-483b-8da5-89d782a41536-kube-api-access-gbxvv\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626003 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.625955 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbxvv\" (UniqueName: \"kubernetes.io/projected/a9b3278f-7ef0-483b-8da5-89d782a41536-kube-api-access-gbxvv\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626195 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626016 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-oauth-serving-cert\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626195 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626048 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-console-config\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626195 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626073 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b3278f-7ef0-483b-8da5-89d782a41536-console-oauth-config\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626195 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626100 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b3278f-7ef0-483b-8da5-89d782a41536-console-serving-cert\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626195 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626121 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-trusted-ca-bundle\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626195 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626148 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-service-ca\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626838 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626763 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-oauth-serving-cert\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626969 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626892 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-console-config\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.626969 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.626895 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-service-ca\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.627187 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.627164 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b3278f-7ef0-483b-8da5-89d782a41536-trusted-ca-bundle\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.628759 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.628718 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b3278f-7ef0-483b-8da5-89d782a41536-console-oauth-config\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.628759 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.628737 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b3278f-7ef0-483b-8da5-89d782a41536-console-serving-cert\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.633813 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.633795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbxvv\" (UniqueName: \"kubernetes.io/projected/a9b3278f-7ef0-483b-8da5-89d782a41536-kube-api-access-gbxvv\") pod \"console-6cf69fd875-4t7rp\" (UID: \"a9b3278f-7ef0-483b-8da5-89d782a41536\") " pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.734748 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.734712 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 07:59:58.859600 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.859562 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf69fd875-4t7rp"]
Apr 17 07:59:58.868510 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:58.868488 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:59:59.576252 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:59.576195 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf69fd875-4t7rp" event={"ID":"a9b3278f-7ef0-483b-8da5-89d782a41536","Type":"ContainerStarted","Data":"9146cd1e25e64c8ab607be8023f55c76a71acf25af5e083f9bbfe51f6fb26bbb"}
Apr 17 07:59:59.576252 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:59.576252 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf69fd875-4t7rp" event={"ID":"a9b3278f-7ef0-483b-8da5-89d782a41536","Type":"ContainerStarted","Data":"fcd908f1daab5dc28b029f428c0563f4013b132bae65cef4946dca30724a2925"}
Apr 17 07:59:59.593725 ip-10-0-133-228 kubenswrapper[2565]: I0417 07:59:59.593669 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cf69fd875-4t7rp" podStartSLOduration=1.5936527630000001 podStartE2EDuration="1.593652763s" podCreationTimestamp="2026-04-17 07:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:59:59.592392952 +0000 UTC m=+512.081274116" watchObservedRunningTime="2026-04-17 07:59:59.593652763 +0000 UTC m=+512.082533915"
Apr 17 08:00:08.735734 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:08.735641 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cf69fd875-4t7rp"
Apr 17 08:00:08.736138 ip-10-0-133-228 kubenswrapper[2565]: I0417
08:00:08.735787 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cf69fd875-4t7rp" Apr 17 08:00:08.740596 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:08.740574 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cf69fd875-4t7rp" Apr 17 08:00:09.609758 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:09.609731 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cf69fd875-4t7rp" Apr 17 08:00:09.655315 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:09.655280 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fc4f64cbc-h4mbc"] Apr 17 08:00:34.680236 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:34.680170 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fc4f64cbc-h4mbc" podUID="d0bc4a87-193c-4cb1-b00c-6731b399161d" containerName="console" containerID="cri-o://3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc" gracePeriod=15 Apr 17 08:00:34.918159 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:34.918138 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fc4f64cbc-h4mbc_d0bc4a87-193c-4cb1-b00c-6731b399161d/console/0.log" Apr 17 08:00:34.918305 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:34.918197 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 08:00:35.042257 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042224 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-config\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.042433 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042264 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-trusted-ca-bundle\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.042433 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042308 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-oauth-serving-cert\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.042555 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042480 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-serving-cert\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.042609 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042568 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6kb7\" (UniqueName: \"kubernetes.io/projected/d0bc4a87-193c-4cb1-b00c-6731b399161d-kube-api-access-j6kb7\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.042660 
ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042604 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-oauth-config\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.042712 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042676 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-config" (OuterVolumeSpecName: "console-config") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:35.042712 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042697 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:35.042823 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042707 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:35.042823 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.042744 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-service-ca\") pod \"d0bc4a87-193c-4cb1-b00c-6731b399161d\" (UID: \"d0bc4a87-193c-4cb1-b00c-6731b399161d\") " Apr 17 08:00:35.043140 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.043114 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-service-ca" (OuterVolumeSpecName: "service-ca") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:00:35.043244 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.043177 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.043244 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.043189 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-trusted-ca-bundle\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.043244 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.043202 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-oauth-serving-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.044734 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.044704 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0bc4a87-193c-4cb1-b00c-6731b399161d-kube-api-access-j6kb7" (OuterVolumeSpecName: "kube-api-access-j6kb7") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "kube-api-access-j6kb7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:00:35.045118 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.045094 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:00:35.045186 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.045113 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d0bc4a87-193c-4cb1-b00c-6731b399161d" (UID: "d0bc4a87-193c-4cb1-b00c-6731b399161d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:00:35.143982 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.143944 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d0bc4a87-193c-4cb1-b00c-6731b399161d-service-ca\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.143982 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.143976 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-serving-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.143982 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.143986 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6kb7\" (UniqueName: \"kubernetes.io/projected/d0bc4a87-193c-4cb1-b00c-6731b399161d-kube-api-access-j6kb7\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.144253 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.143997 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d0bc4a87-193c-4cb1-b00c-6731b399161d-console-oauth-config\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:00:35.691529 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.691500 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fc4f64cbc-h4mbc_d0bc4a87-193c-4cb1-b00c-6731b399161d/console/0.log" Apr 17 08:00:35.691933 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.691539 2565 generic.go:358] "Generic (PLEG): container finished" podID="d0bc4a87-193c-4cb1-b00c-6731b399161d" containerID="3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc" exitCode=2 Apr 17 08:00:35.691933 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.691567 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5fc4f64cbc-h4mbc" event={"ID":"d0bc4a87-193c-4cb1-b00c-6731b399161d","Type":"ContainerDied","Data":"3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc"} Apr 17 08:00:35.691933 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.691606 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc4f64cbc-h4mbc" event={"ID":"d0bc4a87-193c-4cb1-b00c-6731b399161d","Type":"ContainerDied","Data":"6d47eb4cb74af23300c4d9a745a00fbba4d805858e2c1e2f36b5353ae2c5b19a"} Apr 17 08:00:35.691933 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.691610 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fc4f64cbc-h4mbc" Apr 17 08:00:35.691933 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.691621 2565 scope.go:117] "RemoveContainer" containerID="3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc" Apr 17 08:00:35.700768 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.700744 2565 scope.go:117] "RemoveContainer" containerID="3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc" Apr 17 08:00:35.701040 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:00:35.701018 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc\": container with ID starting with 3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc not found: ID does not exist" containerID="3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc" Apr 17 08:00:35.701110 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.701049 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc"} err="failed to get container status \"3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc\": rpc error: code = 
NotFound desc = could not find container \"3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc\": container with ID starting with 3185024656f9b62387123b5a07e32f84b3e932a9f5195aed45f404a6d06ab3dc not found: ID does not exist" Apr 17 08:00:35.714089 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.714057 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fc4f64cbc-h4mbc"] Apr 17 08:00:35.718480 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:35.718458 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fc4f64cbc-h4mbc"] Apr 17 08:00:36.043896 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:00:36.043864 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bc4a87-193c-4cb1-b00c-6731b399161d" path="/var/lib/kubelet/pods/d0bc4a87-193c-4cb1-b00c-6731b399161d/volumes" Apr 17 08:01:02.221411 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.221373 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh"] Apr 17 08:01:02.221867 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.221693 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0bc4a87-193c-4cb1-b00c-6731b399161d" containerName="console" Apr 17 08:01:02.221867 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.221705 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bc4a87-193c-4cb1-b00c-6731b399161d" containerName="console" Apr 17 08:01:02.221867 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.221774 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0bc4a87-193c-4cb1-b00c-6731b399161d" containerName="console" Apr 17 08:01:02.225067 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.225046 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.227293 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.227269 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 08:01:02.227434 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.227415 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 08:01:02.228079 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.228064 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mfg6m\"" Apr 17 08:01:02.232902 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.232883 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh"] Apr 17 08:01:02.277191 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.277146 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9ht\" (UniqueName: \"kubernetes.io/projected/118df768-3af6-443c-a1f3-124c3f079d8a-kube-api-access-qq9ht\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.277191 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.277189 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.277421 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.277308 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.378206 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.378167 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9ht\" (UniqueName: \"kubernetes.io/projected/118df768-3af6-443c-a1f3-124c3f079d8a-kube-api-access-qq9ht\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.378206 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.378228 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.378381 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.378281 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.378647 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.378628 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.378684 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.378653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.386580 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.386542 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9ht\" (UniqueName: \"kubernetes.io/projected/118df768-3af6-443c-a1f3-124c3f079d8a-kube-api-access-qq9ht\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.535537 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.535457 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:02.666579 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.666540 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh"] Apr 17 08:01:02.669894 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:01:02.669854 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod118df768_3af6_443c_a1f3_124c3f079d8a.slice/crio-b9f963df981cbbcf69e373c0c3628cb9c1e4b44fe4813bf21dc72382ca7dce53 WatchSource:0}: Error finding container b9f963df981cbbcf69e373c0c3628cb9c1e4b44fe4813bf21dc72382ca7dce53: Status 404 returned error can't find the container with id b9f963df981cbbcf69e373c0c3628cb9c1e4b44fe4813bf21dc72382ca7dce53 Apr 17 08:01:02.775841 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:02.775801 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" event={"ID":"118df768-3af6-443c-a1f3-124c3f079d8a","Type":"ContainerStarted","Data":"b9f963df981cbbcf69e373c0c3628cb9c1e4b44fe4813bf21dc72382ca7dce53"} Apr 17 08:01:08.797252 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:08.797147 2565 generic.go:358] "Generic (PLEG): container finished" podID="118df768-3af6-443c-a1f3-124c3f079d8a" containerID="15a99237f7c5cee561e752b103aa4e9cd9ed40ccdde9c414706c9575ceb01351" exitCode=0 Apr 17 08:01:08.797252 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:08.797194 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" event={"ID":"118df768-3af6-443c-a1f3-124c3f079d8a","Type":"ContainerDied","Data":"15a99237f7c5cee561e752b103aa4e9cd9ed40ccdde9c414706c9575ceb01351"} Apr 17 08:01:11.809891 ip-10-0-133-228 kubenswrapper[2565]: 
I0417 08:01:11.809857 2565 generic.go:358] "Generic (PLEG): container finished" podID="118df768-3af6-443c-a1f3-124c3f079d8a" containerID="38cd34dd6b87ca05897c5317e2c3082230698d646716f1175ad8b81a00e4f07c" exitCode=0 Apr 17 08:01:11.810387 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:11.809913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" event={"ID":"118df768-3af6-443c-a1f3-124c3f079d8a","Type":"ContainerDied","Data":"38cd34dd6b87ca05897c5317e2c3082230698d646716f1175ad8b81a00e4f07c"} Apr 17 08:01:18.834668 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:18.834633 2565 generic.go:358] "Generic (PLEG): container finished" podID="118df768-3af6-443c-a1f3-124c3f079d8a" containerID="1aeebcc619efd24ff6989a9fd0779a93a0380710ce26ed3d1368f0d7108ca762" exitCode=0 Apr 17 08:01:18.835060 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:18.834714 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" event={"ID":"118df768-3af6-443c-a1f3-124c3f079d8a","Type":"ContainerDied","Data":"1aeebcc619efd24ff6989a9fd0779a93a0380710ce26ed3d1368f0d7108ca762"} Apr 17 08:01:19.974533 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:19.974509 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:20.046685 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.046650 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-util\") pod \"118df768-3af6-443c-a1f3-124c3f079d8a\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " Apr 17 08:01:20.046879 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.046770 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9ht\" (UniqueName: \"kubernetes.io/projected/118df768-3af6-443c-a1f3-124c3f079d8a-kube-api-access-qq9ht\") pod \"118df768-3af6-443c-a1f3-124c3f079d8a\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " Apr 17 08:01:20.046879 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.046823 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-bundle\") pod \"118df768-3af6-443c-a1f3-124c3f079d8a\" (UID: \"118df768-3af6-443c-a1f3-124c3f079d8a\") " Apr 17 08:01:20.047568 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.047537 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-bundle" (OuterVolumeSpecName: "bundle") pod "118df768-3af6-443c-a1f3-124c3f079d8a" (UID: "118df768-3af6-443c-a1f3-124c3f079d8a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:01:20.049151 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.049126 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118df768-3af6-443c-a1f3-124c3f079d8a-kube-api-access-qq9ht" (OuterVolumeSpecName: "kube-api-access-qq9ht") pod "118df768-3af6-443c-a1f3-124c3f079d8a" (UID: "118df768-3af6-443c-a1f3-124c3f079d8a"). InnerVolumeSpecName "kube-api-access-qq9ht". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:01:20.052071 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.052046 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-util" (OuterVolumeSpecName: "util") pod "118df768-3af6-443c-a1f3-124c3f079d8a" (UID: "118df768-3af6-443c-a1f3-124c3f079d8a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:01:20.147607 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.147496 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-util\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:01:20.147607 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.147533 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qq9ht\" (UniqueName: \"kubernetes.io/projected/118df768-3af6-443c-a1f3-124c3f079d8a-kube-api-access-qq9ht\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:01:20.147607 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.147544 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118df768-3af6-443c-a1f3-124c3f079d8a-bundle\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\"" Apr 17 08:01:20.851823 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.851788 2565 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" event={"ID":"118df768-3af6-443c-a1f3-124c3f079d8a","Type":"ContainerDied","Data":"b9f963df981cbbcf69e373c0c3628cb9c1e4b44fe4813bf21dc72382ca7dce53"} Apr 17 08:01:20.851823 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.851828 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9f963df981cbbcf69e373c0c3628cb9c1e4b44fe4813bf21dc72382ca7dce53" Apr 17 08:01:20.852030 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:20.851795 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb9qsh" Apr 17 08:01:23.845728 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.845691 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd"] Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846017 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="util" Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846029 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="util" Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846048 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="pull" Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846054 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="pull" Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846061 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="extract" Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846067 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="extract" Apr 17 08:01:23.846169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.846144 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="118df768-3af6-443c-a1f3-124c3f079d8a" containerName="extract" Apr 17 08:01:23.852874 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.852851 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:23.855015 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.854993 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 08:01:23.855194 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.854995 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 08:01:23.855194 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.855033 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gkc9d\"" Apr 17 08:01:23.856546 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.856525 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 08:01:23.859121 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.859098 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd"] Apr 17 08:01:23.982852 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.982806 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/10927611-ab58-4bb7-9f3b-135418da3b00-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd\" (UID: \"10927611-ab58-4bb7-9f3b-135418da3b00\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:23.983036 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:23.982894 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298pl\" (UniqueName: \"kubernetes.io/projected/10927611-ab58-4bb7-9f3b-135418da3b00-kube-api-access-298pl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd\" (UID: \"10927611-ab58-4bb7-9f3b-135418da3b00\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:24.083852 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.083815 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/10927611-ab58-4bb7-9f3b-135418da3b00-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd\" (UID: \"10927611-ab58-4bb7-9f3b-135418da3b00\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:24.084067 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.083986 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-298pl\" (UniqueName: \"kubernetes.io/projected/10927611-ab58-4bb7-9f3b-135418da3b00-kube-api-access-298pl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd\" (UID: \"10927611-ab58-4bb7-9f3b-135418da3b00\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:24.086471 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.086446 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/10927611-ab58-4bb7-9f3b-135418da3b00-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd\" (UID: 
\"10927611-ab58-4bb7-9f3b-135418da3b00\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:24.092435 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.092403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-298pl\" (UniqueName: \"kubernetes.io/projected/10927611-ab58-4bb7-9f3b-135418da3b00-kube-api-access-298pl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd\" (UID: \"10927611-ab58-4bb7-9f3b-135418da3b00\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:24.164863 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.164766 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:24.294018 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.293990 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd"] Apr 17 08:01:24.296683 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:01:24.296654 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10927611_ab58_4bb7_9f3b_135418da3b00.slice/crio-0e13e2b3c06ce3f301a92b9090c9304a4f175b738196c36c51d554ef56185df0 WatchSource:0}: Error finding container 0e13e2b3c06ce3f301a92b9090c9304a4f175b738196c36c51d554ef56185df0: Status 404 returned error can't find the container with id 0e13e2b3c06ce3f301a92b9090c9304a4f175b738196c36c51d554ef56185df0 Apr 17 08:01:24.864922 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:24.864889 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" event={"ID":"10927611-ab58-4bb7-9f3b-135418da3b00","Type":"ContainerStarted","Data":"0e13e2b3c06ce3f301a92b9090c9304a4f175b738196c36c51d554ef56185df0"} Apr 17 08:01:28.776768 ip-10-0-133-228 kubenswrapper[2565]: 
I0417 08:01:28.776735 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log" Apr 17 08:01:28.777429 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:28.777413 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log" Apr 17 08:01:29.620342 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.620308 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-fsqxt"] Apr 17 08:01:29.647079 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.647038 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-fsqxt"] Apr 17 08:01:29.647259 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.647161 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.649373 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.649347 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 08:01:29.649510 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.649389 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p4tfj\"" Apr 17 08:01:29.649510 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.649444 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 08:01:29.842825 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.842776 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: 
\"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.843275 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.842863 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2db\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-kube-api-access-bc2db\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.843275 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.842928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/93f12956-a5b9-41c5-acc8-31db83f207c3-cabundle0\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.884768 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.884680 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" event={"ID":"10927611-ab58-4bb7-9f3b-135418da3b00","Type":"ContainerStarted","Data":"e4a698a79da4d5ff77e926dee8c627db65595366e9d952b6dc80e193de0fb2fb"} Apr 17 08:01:29.884930 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.884793 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:01:29.905442 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.905382 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" podStartSLOduration=2.124869223 podStartE2EDuration="6.905366121s" podCreationTimestamp="2026-04-17 08:01:23 +0000 UTC" firstStartedPulling="2026-04-17 08:01:24.298639369 +0000 UTC m=+596.787520499" 
lastFinishedPulling="2026-04-17 08:01:29.079136261 +0000 UTC m=+601.568017397" observedRunningTime="2026-04-17 08:01:29.904095246 +0000 UTC m=+602.392976397" watchObservedRunningTime="2026-04-17 08:01:29.905366121 +0000 UTC m=+602.394247272" Apr 17 08:01:29.943804 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.943763 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2db\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-kube-api-access-bc2db\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.944009 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.943834 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/93f12956-a5b9-41c5-acc8-31db83f207c3-cabundle0\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.944009 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.943898 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.944139 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:29.944045 2565 secret.go:281] references non-existent secret key: ca.crt Apr 17 08:01:29.944139 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:29.944073 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 08:01:29.944139 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:29.944088 2565 projected.go:194] Error preparing data for projected volume certificates 
for pod openshift-keda/keda-operator-ffbb595cb-fsqxt: references non-existent secret key: ca.crt Apr 17 08:01:29.944305 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:29.944159 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates podName:93f12956-a5b9-41c5-acc8-31db83f207c3 nodeName:}" failed. No retries permitted until 2026-04-17 08:01:30.444135222 +0000 UTC m=+602.933016370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates") pod "keda-operator-ffbb595cb-fsqxt" (UID: "93f12956-a5b9-41c5-acc8-31db83f207c3") : references non-existent secret key: ca.crt Apr 17 08:01:29.944689 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.944667 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/93f12956-a5b9-41c5-acc8-31db83f207c3-cabundle0\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:29.951621 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:29.951594 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2db\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-kube-api-access-bc2db\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:30.448179 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:30.448131 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" 
Apr 17 08:01:30.448390 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:30.448307 2565 secret.go:281] references non-existent secret key: ca.crt Apr 17 08:01:30.448390 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:30.448328 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 08:01:30.448390 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:30.448337 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-fsqxt: references non-existent secret key: ca.crt Apr 17 08:01:30.448390 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:30.448390 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates podName:93f12956-a5b9-41c5-acc8-31db83f207c3 nodeName:}" failed. No retries permitted until 2026-04-17 08:01:31.448375148 +0000 UTC m=+603.937256283 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates") pod "keda-operator-ffbb595cb-fsqxt" (UID: "93f12956-a5b9-41c5-acc8-31db83f207c3") : references non-existent secret key: ca.crt Apr 17 08:01:31.458404 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:31.458364 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:31.458798 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:31.458515 2565 secret.go:281] references non-existent secret key: ca.crt Apr 17 08:01:31.458798 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:31.458535 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 08:01:31.458798 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:31.458546 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-fsqxt: references non-existent secret key: ca.crt Apr 17 08:01:31.458798 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:31.458609 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates podName:93f12956-a5b9-41c5-acc8-31db83f207c3 nodeName:}" failed. No retries permitted until 2026-04-17 08:01:33.458593314 +0000 UTC m=+605.947474443 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates") pod "keda-operator-ffbb595cb-fsqxt" (UID: "93f12956-a5b9-41c5-acc8-31db83f207c3") : references non-existent secret key: ca.crt Apr 17 08:01:33.474767 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:33.474726 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:33.475146 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:33.474863 2565 secret.go:281] references non-existent secret key: ca.crt Apr 17 08:01:33.475146 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:33.474884 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 08:01:33.475146 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:33.474894 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-fsqxt: references non-existent secret key: ca.crt Apr 17 08:01:33.475146 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:01:33.474959 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates podName:93f12956-a5b9-41c5-acc8-31db83f207c3 nodeName:}" failed. No retries permitted until 2026-04-17 08:01:37.474942342 +0000 UTC m=+609.963823476 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates") pod "keda-operator-ffbb595cb-fsqxt" (UID: "93f12956-a5b9-41c5-acc8-31db83f207c3") : references non-existent secret key: ca.crt Apr 17 08:01:37.503472 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:37.503371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:37.505998 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:37.505970 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/93f12956-a5b9-41c5-acc8-31db83f207c3-certificates\") pod \"keda-operator-ffbb595cb-fsqxt\" (UID: \"93f12956-a5b9-41c5-acc8-31db83f207c3\") " pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:37.757694 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:37.757602 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:37.874177 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:37.874151 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-fsqxt"] Apr 17 08:01:37.876993 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:01:37.876962 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f12956_a5b9_41c5_acc8_31db83f207c3.slice/crio-0c962c718c439ba1b1a431401c1803fa8ed6e5cb3c134600c321aa07d798e967 WatchSource:0}: Error finding container 0c962c718c439ba1b1a431401c1803fa8ed6e5cb3c134600c321aa07d798e967: Status 404 returned error can't find the container with id 0c962c718c439ba1b1a431401c1803fa8ed6e5cb3c134600c321aa07d798e967 Apr 17 08:01:37.912105 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:37.912072 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" event={"ID":"93f12956-a5b9-41c5-acc8-31db83f207c3","Type":"ContainerStarted","Data":"0c962c718c439ba1b1a431401c1803fa8ed6e5cb3c134600c321aa07d798e967"} Apr 17 08:01:40.923395 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:40.923359 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" event={"ID":"93f12956-a5b9-41c5-acc8-31db83f207c3","Type":"ContainerStarted","Data":"04e3364bffd696c1b890417217c0c1c344f6b9876b91c0b78224f00000145105"} Apr 17 08:01:40.923797 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:40.923511 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:01:40.939376 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:40.939320 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" podStartSLOduration=9.03534882 podStartE2EDuration="11.93930701s" 
podCreationTimestamp="2026-04-17 08:01:29 +0000 UTC" firstStartedPulling="2026-04-17 08:01:37.878280536 +0000 UTC m=+610.367161667" lastFinishedPulling="2026-04-17 08:01:40.782238725 +0000 UTC m=+613.271119857" observedRunningTime="2026-04-17 08:01:40.938395152 +0000 UTC m=+613.427276305" watchObservedRunningTime="2026-04-17 08:01:40.93930701 +0000 UTC m=+613.428188163" Apr 17 08:01:50.890064 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:01:50.890024 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-8n7rd" Apr 17 08:02:01.929643 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:01.929605 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-fsqxt" Apr 17 08:02:36.270348 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.270303 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-44glw"] Apr 17 08:02:36.278734 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.278697 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"] Apr 17 08:02:36.278895 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.278824 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-44glw" Apr 17 08:02:36.282255 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.282199 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:02:36.282407 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.282201 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-v4wxz\"" Apr 17 08:02:36.282407 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.282384 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:02:36.282530 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.282237 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" Apr 17 08:02:36.282639 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.282618 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 08:02:36.283583 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.283555 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-44glw"] Apr 17 08:02:36.284332 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.284313 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 08:02:36.284562 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.284545 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-7g4cs\"" Apr 17 08:02:36.289559 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.289540 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"] Apr 17 08:02:36.299469 ip-10-0-133-228 kubenswrapper[2565]: I0417 
08:02:36.299436 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert\") pod \"kserve-controller-manager-558564fd68-44glw\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw" Apr 17 08:02:36.299626 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.299493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c041ea6-e048-48f4-8830-64e163f148ff-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" Apr 17 08:02:36.299626 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.299537 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tklrl\" (UniqueName: \"kubernetes.io/projected/a65b53e1-f002-4b7d-8e30-782d68598981-kube-api-access-tklrl\") pod \"kserve-controller-manager-558564fd68-44glw\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw" Apr 17 08:02:36.299819 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.299795 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db6ht\" (UniqueName: \"kubernetes.io/projected/4c041ea6-e048-48f4-8830-64e163f148ff-kube-api-access-db6ht\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" Apr 17 08:02:36.401527 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.401489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert\") pod 
\"kserve-controller-manager-558564fd68-44glw\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw" Apr 17 08:02:36.401527 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.401527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c041ea6-e048-48f4-8830-64e163f148ff-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" Apr 17 08:02:36.401786 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.401555 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tklrl\" (UniqueName: \"kubernetes.io/projected/a65b53e1-f002-4b7d-8e30-782d68598981-kube-api-access-tklrl\") pod \"kserve-controller-manager-558564fd68-44glw\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw" Apr 17 08:02:36.401786 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.401587 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db6ht\" (UniqueName: \"kubernetes.io/projected/4c041ea6-e048-48f4-8830-64e163f148ff-kube-api-access-db6ht\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" Apr 17 08:02:36.401786 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:02:36.401660 2565 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 17 08:02:36.401786 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:02:36.401699 2565 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 08:02:36.401786 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:02:36.401749 2565 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/4c041ea6-e048-48f4-8830-64e163f148ff-cert podName:4c041ea6-e048-48f4-8830-64e163f148ff nodeName:}" failed. No retries permitted until 2026-04-17 08:02:36.901724739 +0000 UTC m=+669.390605889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c041ea6-e048-48f4-8830-64e163f148ff-cert") pod "llmisvc-controller-manager-68cc5db7c4-q7s8x" (UID: "4c041ea6-e048-48f4-8830-64e163f148ff") : secret "llmisvc-webhook-server-cert" not found Apr 17 08:02:36.401786 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:02:36.401765 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert podName:a65b53e1-f002-4b7d-8e30-782d68598981 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:36.901758297 +0000 UTC m=+669.390639432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert") pod "kserve-controller-manager-558564fd68-44glw" (UID: "a65b53e1-f002-4b7d-8e30-782d68598981") : secret "kserve-webhook-server-cert" not found Apr 17 08:02:36.410914 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.410878 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db6ht\" (UniqueName: \"kubernetes.io/projected/4c041ea6-e048-48f4-8830-64e163f148ff-kube-api-access-db6ht\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" Apr 17 08:02:36.413153 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.413131 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tklrl\" (UniqueName: \"kubernetes.io/projected/a65b53e1-f002-4b7d-8e30-782d68598981-kube-api-access-tklrl\") pod \"kserve-controller-manager-558564fd68-44glw\" (UID: 
\"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:02:36.907422 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.907382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert\") pod \"kserve-controller-manager-558564fd68-44glw\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:02:36.907422 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.907425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c041ea6-e048-48f4-8830-64e163f148ff-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"
Apr 17 08:02:36.909914 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.909884 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert\") pod \"kserve-controller-manager-558564fd68-44glw\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") " pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:02:36.910009 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:36.909918 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c041ea6-e048-48f4-8830-64e163f148ff-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q7s8x\" (UID: \"4c041ea6-e048-48f4-8830-64e163f148ff\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"
Apr 17 08:02:37.193849 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:37.193749 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:02:37.199646 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:37.199614 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"
Apr 17 08:02:37.326626 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:37.326588 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-44glw"]
Apr 17 08:02:37.328791 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:02:37.328736 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65b53e1_f002_4b7d_8e30_782d68598981.slice/crio-2a62d45f5dab6c15a7d7b51fec74d778d587e0b54d9387d5398b42b37559f3fd WatchSource:0}: Error finding container 2a62d45f5dab6c15a7d7b51fec74d778d587e0b54d9387d5398b42b37559f3fd: Status 404 returned error can't find the container with id 2a62d45f5dab6c15a7d7b51fec74d778d587e0b54d9387d5398b42b37559f3fd
Apr 17 08:02:37.347443 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:37.347422 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"]
Apr 17 08:02:37.349922 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:02:37.349896 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c041ea6_e048_48f4_8830_64e163f148ff.slice/crio-075d7a466b1c3113bf2627c9d90243049976855cfa18bd515148171c955298d0 WatchSource:0}: Error finding container 075d7a466b1c3113bf2627c9d90243049976855cfa18bd515148171c955298d0: Status 404 returned error can't find the container with id 075d7a466b1c3113bf2627c9d90243049976855cfa18bd515148171c955298d0
Apr 17 08:02:38.117722 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:38.117653 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-44glw" event={"ID":"a65b53e1-f002-4b7d-8e30-782d68598981","Type":"ContainerStarted","Data":"2a62d45f5dab6c15a7d7b51fec74d778d587e0b54d9387d5398b42b37559f3fd"}
Apr 17 08:02:38.119735 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:38.119638 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" event={"ID":"4c041ea6-e048-48f4-8830-64e163f148ff","Type":"ContainerStarted","Data":"075d7a466b1c3113bf2627c9d90243049976855cfa18bd515148171c955298d0"}
Apr 17 08:02:41.131562 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:41.131509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" event={"ID":"4c041ea6-e048-48f4-8830-64e163f148ff","Type":"ContainerStarted","Data":"afae5c83a7135a9f84811b405bdfba33c3815f693b8467567651b7a8450729d1"}
Apr 17 08:02:41.132107 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:41.131608 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"
Apr 17 08:02:41.132957 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:41.132927 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-44glw" event={"ID":"a65b53e1-f002-4b7d-8e30-782d68598981","Type":"ContainerStarted","Data":"f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd"}
Apr 17 08:02:41.133083 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:41.133034 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:02:41.147351 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:41.147306 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x" podStartSLOduration=1.96255468 podStartE2EDuration="5.147293603s" podCreationTimestamp="2026-04-17 08:02:36 +0000 UTC" firstStartedPulling="2026-04-17 08:02:37.351339112 +0000 UTC m=+669.840220243" lastFinishedPulling="2026-04-17 08:02:40.53607803 +0000 UTC m=+673.024959166" observedRunningTime="2026-04-17 08:02:41.146414642 +0000 UTC m=+673.635295797" watchObservedRunningTime="2026-04-17 08:02:41.147293603 +0000 UTC m=+673.636174835"
Apr 17 08:02:41.163668 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:02:41.163491 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-44glw" podStartSLOduration=1.95776865 podStartE2EDuration="5.163474579s" podCreationTimestamp="2026-04-17 08:02:36 +0000 UTC" firstStartedPulling="2026-04-17 08:02:37.330148509 +0000 UTC m=+669.819029648" lastFinishedPulling="2026-04-17 08:02:40.535854427 +0000 UTC m=+673.024735577" observedRunningTime="2026-04-17 08:02:41.162960163 +0000 UTC m=+673.651841330" watchObservedRunningTime="2026-04-17 08:02:41.163474579 +0000 UTC m=+673.652355732"
Apr 17 08:03:12.139427 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:12.139393 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q7s8x"
Apr 17 08:03:12.142616 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:12.142590 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:03:13.612430 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.612395 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-44glw"]
Apr 17 08:03:13.612853 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.612635 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-558564fd68-44glw" podUID="a65b53e1-f002-4b7d-8e30-782d68598981" containerName="manager" containerID="cri-o://f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd" gracePeriod=10
Apr 17 08:03:13.636988 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.636950 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-z6v8s"]
Apr 17 08:03:13.645923 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.645893 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.652202 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.652172 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-z6v8s"]
Apr 17 08:03:13.744501 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.744457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rl49\" (UniqueName: \"kubernetes.io/projected/b691469f-c21a-4775-a20e-6e3bea432807-kube-api-access-9rl49\") pod \"kserve-controller-manager-558564fd68-z6v8s\" (UID: \"b691469f-c21a-4775-a20e-6e3bea432807\") " pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.744694 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.744605 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b691469f-c21a-4775-a20e-6e3bea432807-cert\") pod \"kserve-controller-manager-558564fd68-z6v8s\" (UID: \"b691469f-c21a-4775-a20e-6e3bea432807\") " pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.845370 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.845338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rl49\" (UniqueName: \"kubernetes.io/projected/b691469f-c21a-4775-a20e-6e3bea432807-kube-api-access-9rl49\") pod \"kserve-controller-manager-558564fd68-z6v8s\" (UID: \"b691469f-c21a-4775-a20e-6e3bea432807\") " pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.845513 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.845421 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b691469f-c21a-4775-a20e-6e3bea432807-cert\") pod \"kserve-controller-manager-558564fd68-z6v8s\" (UID: \"b691469f-c21a-4775-a20e-6e3bea432807\") " pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.847852 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.847829 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b691469f-c21a-4775-a20e-6e3bea432807-cert\") pod \"kserve-controller-manager-558564fd68-z6v8s\" (UID: \"b691469f-c21a-4775-a20e-6e3bea432807\") " pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.853400 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.853366 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rl49\" (UniqueName: \"kubernetes.io/projected/b691469f-c21a-4775-a20e-6e3bea432807-kube-api-access-9rl49\") pod \"kserve-controller-manager-558564fd68-z6v8s\" (UID: \"b691469f-c21a-4775-a20e-6e3bea432807\") " pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:13.864990 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.864922 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:03:13.999489 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:13.999451 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:14.046919 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.046868 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tklrl\" (UniqueName: \"kubernetes.io/projected/a65b53e1-f002-4b7d-8e30-782d68598981-kube-api-access-tklrl\") pod \"a65b53e1-f002-4b7d-8e30-782d68598981\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") "
Apr 17 08:03:14.046919 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.046917 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert\") pod \"a65b53e1-f002-4b7d-8e30-782d68598981\" (UID: \"a65b53e1-f002-4b7d-8e30-782d68598981\") "
Apr 17 08:03:14.049943 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.049910 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert" (OuterVolumeSpecName: "cert") pod "a65b53e1-f002-4b7d-8e30-782d68598981" (UID: "a65b53e1-f002-4b7d-8e30-782d68598981"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:03:14.050556 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.050533 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65b53e1-f002-4b7d-8e30-782d68598981-kube-api-access-tklrl" (OuterVolumeSpecName: "kube-api-access-tklrl") pod "a65b53e1-f002-4b7d-8e30-782d68598981" (UID: "a65b53e1-f002-4b7d-8e30-782d68598981"). InnerVolumeSpecName "kube-api-access-tklrl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:03:14.125984 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.125957 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-z6v8s"]
Apr 17 08:03:14.128380 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:03:14.128350 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb691469f_c21a_4775_a20e_6e3bea432807.slice/crio-07125e5d7dc82e172e1426bdddd6e7ce9bb7291e5353a7e6ac5f4674c6580c9e WatchSource:0}: Error finding container 07125e5d7dc82e172e1426bdddd6e7ce9bb7291e5353a7e6ac5f4674c6580c9e: Status 404 returned error can't find the container with id 07125e5d7dc82e172e1426bdddd6e7ce9bb7291e5353a7e6ac5f4674c6580c9e
Apr 17 08:03:14.148395 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.148363 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tklrl\" (UniqueName: \"kubernetes.io/projected/a65b53e1-f002-4b7d-8e30-782d68598981-kube-api-access-tklrl\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 08:03:14.148395 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.148393 2565 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a65b53e1-f002-4b7d-8e30-782d68598981-cert\") on node \"ip-10-0-133-228.ec2.internal\" DevicePath \"\""
Apr 17 08:03:14.252627 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.252587 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-z6v8s" event={"ID":"b691469f-c21a-4775-a20e-6e3bea432807","Type":"ContainerStarted","Data":"07125e5d7dc82e172e1426bdddd6e7ce9bb7291e5353a7e6ac5f4674c6580c9e"}
Apr 17 08:03:14.253758 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.253733 2565 generic.go:358] "Generic (PLEG): container finished" podID="a65b53e1-f002-4b7d-8e30-782d68598981" containerID="f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd" exitCode=0
Apr 17 08:03:14.253870 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.253794 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-44glw" event={"ID":"a65b53e1-f002-4b7d-8e30-782d68598981","Type":"ContainerDied","Data":"f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd"}
Apr 17 08:03:14.253870 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.253798 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-44glw"
Apr 17 08:03:14.253870 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.253817 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-44glw" event={"ID":"a65b53e1-f002-4b7d-8e30-782d68598981","Type":"ContainerDied","Data":"2a62d45f5dab6c15a7d7b51fec74d778d587e0b54d9387d5398b42b37559f3fd"}
Apr 17 08:03:14.253870 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.253832 2565 scope.go:117] "RemoveContainer" containerID="f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd"
Apr 17 08:03:14.262433 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.262411 2565 scope.go:117] "RemoveContainer" containerID="f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd"
Apr 17 08:03:14.262742 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:03:14.262722 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd\": container with ID starting with f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd not found: ID does not exist" containerID="f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd"
Apr 17 08:03:14.262800 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.262750 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd"} err="failed to get container status \"f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd\": rpc error: code = NotFound desc = could not find container \"f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd\": container with ID starting with f5f5d812cebfc42367d3da22e836cf1c71cebfc7599f9a876ebc504833916ddd not found: ID does not exist"
Apr 17 08:03:14.274099 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.274066 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-44glw"]
Apr 17 08:03:14.277590 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:14.277564 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-44glw"]
Apr 17 08:03:15.259672 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:15.259633 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-z6v8s" event={"ID":"b691469f-c21a-4775-a20e-6e3bea432807","Type":"ContainerStarted","Data":"2665b9104b78b977aee05c702e5491d44d51d3bd193662d87f480704e8ef2838"}
Apr 17 08:03:15.260158 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:15.259781 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:15.276275 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:15.276201 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-z6v8s" podStartSLOduration=1.8648997 podStartE2EDuration="2.276185327s" podCreationTimestamp="2026-04-17 08:03:13 +0000 UTC" firstStartedPulling="2026-04-17 08:03:14.129649463 +0000 UTC m=+706.618530592" lastFinishedPulling="2026-04-17 08:03:14.540935079 +0000 UTC m=+707.029816219" observedRunningTime="2026-04-17 08:03:15.273533538 +0000 UTC m=+707.762414692" watchObservedRunningTime="2026-04-17 08:03:15.276185327 +0000 UTC m=+707.765066479"
Apr 17 08:03:16.044401 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:16.044371 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65b53e1-f002-4b7d-8e30-782d68598981" path="/var/lib/kubelet/pods/a65b53e1-f002-4b7d-8e30-782d68598981/volumes"
Apr 17 08:03:46.267815 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:46.267778 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-z6v8s"
Apr 17 08:03:47.158704 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.158668 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-52b6g"]
Apr 17 08:03:47.159166 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.159147 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a65b53e1-f002-4b7d-8e30-782d68598981" containerName="manager"
Apr 17 08:03:47.159166 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.159167 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65b53e1-f002-4b7d-8e30-782d68598981" containerName="manager"
Apr 17 08:03:47.159342 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.159305 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a65b53e1-f002-4b7d-8e30-782d68598981" containerName="manager"
Apr 17 08:03:47.162370 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.162352 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.164547 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.164522 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-nbk6v\""
Apr 17 08:03:47.164648 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.164579 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 17 08:03:47.174002 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.173977 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-52b6g"]
Apr 17 08:03:47.233462 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.233421 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-cert\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.233637 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.233488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvq8z\" (UniqueName: \"kubernetes.io/projected/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-kube-api-access-jvq8z\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.334415 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.334382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-cert\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.334828 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.334435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvq8z\" (UniqueName: \"kubernetes.io/projected/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-kube-api-access-jvq8z\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.334828 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:03:47.334545 2565 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 08:03:47.334828 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:03:47.334620 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-cert podName:b03f7029-34ba-45bd-bb5f-cf4c85d600d9 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:47.834602419 +0000 UTC m=+740.323483549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-cert") pod "odh-model-controller-696fc77849-52b6g" (UID: "b03f7029-34ba-45bd-bb5f-cf4c85d600d9") : secret "odh-model-controller-webhook-cert" not found
Apr 17 08:03:47.343011 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.342982 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvq8z\" (UniqueName: \"kubernetes.io/projected/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-kube-api-access-jvq8z\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.839692 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.839657 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-cert\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:47.842142 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:47.842123 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03f7029-34ba-45bd-bb5f-cf4c85d600d9-cert\") pod \"odh-model-controller-696fc77849-52b6g\" (UID: \"b03f7029-34ba-45bd-bb5f-cf4c85d600d9\") " pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:48.072420 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:48.072389 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:03:48.191287 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:48.191263 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-52b6g"]
Apr 17 08:03:48.193960 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:03:48.193929 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03f7029_34ba_45bd_bb5f_cf4c85d600d9.slice/crio-7feb4827d8fd9806b63716ca6c8ca70fe68bee569a6cb53f5333649bd2f586bd WatchSource:0}: Error finding container 7feb4827d8fd9806b63716ca6c8ca70fe68bee569a6cb53f5333649bd2f586bd: Status 404 returned error can't find the container with id 7feb4827d8fd9806b63716ca6c8ca70fe68bee569a6cb53f5333649bd2f586bd
Apr 17 08:03:48.379286 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:48.379191 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-52b6g" event={"ID":"b03f7029-34ba-45bd-bb5f-cf4c85d600d9","Type":"ContainerStarted","Data":"7feb4827d8fd9806b63716ca6c8ca70fe68bee569a6cb53f5333649bd2f586bd"}
Apr 17 08:03:51.394261 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:51.394140 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-52b6g" event={"ID":"b03f7029-34ba-45bd-bb5f-cf4c85d600d9","Type":"ContainerStarted","Data":"283f5778cc5ee15896cce3c3ae16ae2e559826874ad1b1741beb891d86b22619"}
Apr 17 08:03:51.409728 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:51.409669 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-52b6g" podStartSLOduration=1.627217498 podStartE2EDuration="4.409651278s" podCreationTimestamp="2026-04-17 08:03:47 +0000 UTC" firstStartedPulling="2026-04-17 08:03:48.195161688 +0000 UTC m=+740.684042819" lastFinishedPulling="2026-04-17 08:03:50.977595468 +0000 UTC m=+743.466476599" observedRunningTime="2026-04-17 08:03:51.408536719 +0000 UTC m=+743.897417873" watchObservedRunningTime="2026-04-17 08:03:51.409651278 +0000 UTC m=+743.898532431"
Apr 17 08:03:52.397636 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:03:52.397606 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:04:03.403737 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:04:03.403704 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-52b6g"
Apr 17 08:06:28.804313 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:06:28.804287 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 08:06:28.804830 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:06:28.804359 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 08:09:19.684290 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.684254 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"]
Apr 17 08:09:19.688027 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.688000 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"
Apr 17 08:09:19.690116 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.690091 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k26cm\""
Apr 17 08:09:19.696705 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.696675 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"]
Apr 17 08:09:19.699281 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.699261 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"
Apr 17 08:09:19.845109 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.845084 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"]
Apr 17 08:09:19.847392 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:09:19.847364 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68632be1_ab88_469c_869b_4e1db08ad84e.slice/crio-382251de87b8e8656d71ddfa3c27f18fd9dfdaadb574a72a6194a38c38003da8 WatchSource:0}: Error finding container 382251de87b8e8656d71ddfa3c27f18fd9dfdaadb574a72a6194a38c38003da8: Status 404 returned error can't find the container with id 382251de87b8e8656d71ddfa3c27f18fd9dfdaadb574a72a6194a38c38003da8
Apr 17 08:09:19.849306 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:19.849287 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:09:20.532863 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:20.532829 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt" event={"ID":"68632be1-ab88-469c-869b-4e1db08ad84e","Type":"ContainerStarted","Data":"382251de87b8e8656d71ddfa3c27f18fd9dfdaadb574a72a6194a38c38003da8"}
Apr 17 08:09:21.538336 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:21.538303 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt" event={"ID":"68632be1-ab88-469c-869b-4e1db08ad84e","Type":"ContainerStarted","Data":"a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3"}
Apr 17 08:09:21.538745 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:21.538512 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"
Apr 17 08:09:21.540403 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:21.540380 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"
Apr 17 08:09:21.553621 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:09:21.553574 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt" podStartSLOduration=1.299837162 podStartE2EDuration="2.553559318s" podCreationTimestamp="2026-04-17 08:09:19 +0000 UTC" firstStartedPulling="2026-04-17 08:09:19.849428074 +0000 UTC m=+1072.338309203" lastFinishedPulling="2026-04-17 08:09:21.103150229 +0000 UTC m=+1073.592031359" observedRunningTime="2026-04-17 08:09:21.551545986 +0000 UTC m=+1074.040427139" watchObservedRunningTime="2026-04-17 08:09:21.553559318 +0000 UTC m=+1074.042440517"
Apr 17 08:10:54.787006 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:54.786920 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt_68632be1-ab88-469c-869b-4e1db08ad84e/kserve-container/0.log"
Apr 17 08:10:55.086890 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.086810 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"]
Apr 17 08:10:55.087137 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.087110 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt" podUID="68632be1-ab88-469c-869b-4e1db08ad84e" containerName="kserve-container" containerID="cri-o://a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3" gracePeriod=30
Apr 17 08:10:55.333709 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.333685 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"
Apr 17 08:10:55.864156 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.864123 2565 generic.go:358] "Generic (PLEG): container finished" podID="68632be1-ab88-469c-869b-4e1db08ad84e" containerID="a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3" exitCode=2
Apr 17 08:10:55.864597 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.864182 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"
Apr 17 08:10:55.864597 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.864242 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt" event={"ID":"68632be1-ab88-469c-869b-4e1db08ad84e","Type":"ContainerDied","Data":"a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3"}
Apr 17 08:10:55.864597 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.864283 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt" event={"ID":"68632be1-ab88-469c-869b-4e1db08ad84e","Type":"ContainerDied","Data":"382251de87b8e8656d71ddfa3c27f18fd9dfdaadb574a72a6194a38c38003da8"}
Apr 17 08:10:55.864597 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.864304 2565 scope.go:117] "RemoveContainer" containerID="a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3"
Apr 17 08:10:55.872545 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.872526 2565 scope.go:117] "RemoveContainer" containerID="a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3"
Apr 17 08:10:55.872788 ip-10-0-133-228 kubenswrapper[2565]: E0417 08:10:55.872762 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3\": container with ID starting with a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3 not found: ID does not exist" containerID="a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3"
Apr 17 08:10:55.872837 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.872798 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3"} err="failed to get container status \"a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3\": rpc error: code = NotFound desc = could not find container \"a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3\": container with ID starting with a7d29e8fd6b6efea371e2c1ecf9c76f611464c1920343253af4cc7d7f267b2d3 not found: ID does not exist"
Apr 17 08:10:55.884285 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.884261 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"]
Apr 17 08:10:55.887495 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:55.887473 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-0e2b3-predictor-868ff969d8-tz4lt"]
Apr 17 08:10:56.043708 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:10:56.043663 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68632be1-ab88-469c-869b-4e1db08ad84e" path="/var/lib/kubelet/pods/68632be1-ab88-469c-869b-4e1db08ad84e/volumes"
Apr 17 08:11:28.831194 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:11:28.831165 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 08:11:28.832543 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:11:28.832521 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 08:16:28.859073 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:16:28.859042 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 08:16:28.862172 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:16:28.862149 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log"
Apr 17 08:18:20.127428 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:20.127398 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mzn5t_a1d097b8-d651-4c5a-aee8-e970c942c7bd/global-pull-secret-syncer/0.log"
Apr 17 08:18:20.204090 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:20.204060 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p5wkz_ec96c0de-0a97-4ee9-91ef-9bcb0e0c73d3/konnectivity-agent/0.log"
Apr 17 08:18:20.293344 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:20.293307 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-228.ec2.internal_6fa2d3d4e5da99e17b218f9fc59a91d2/haproxy/0.log"
Apr 17 08:18:23.413372 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.413338 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/alertmanager/0.log"
Apr 17 08:18:23.441550 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.441525 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/config-reloader/0.log"
Apr 17 08:18:23.472828 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.472807 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/kube-rbac-proxy-web/0.log"
Apr 17 08:18:23.495672 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.495646 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/kube-rbac-proxy/0.log"
Apr 17 08:18:23.531151 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.531115 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/kube-rbac-proxy-metric/0.log"
Apr 17 08:18:23.558698 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.558673 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/prom-label-proxy/0.log"
Apr 17 08:18:23.581962 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.581933 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c74ad00b-b762-4945-ac23-5e04425bd6cd/init-config-reloader/0.log"
Apr 17 08:18:23.660295 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.660264 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9czgc_02065558-0ec1-4075-aac4-36ffc7ebb493/kube-state-metrics/0.log"
Apr 17 08:18:23.679957 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.679881 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9czgc_02065558-0ec1-4075-aac4-36ffc7ebb493/kube-rbac-proxy-main/0.log"
Apr 17 08:18:23.699866 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.699840 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9czgc_02065558-0ec1-4075-aac4-36ffc7ebb493/kube-rbac-proxy-self/0.log"
Apr 17 08:18:23.724117 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.724093 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7b94f87fff-dbzqr_60c78173-0f99-48c3-abbb-50b961b43510/metrics-server/0.log"
Apr 17 08:18:23.752659 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.752628 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4c4gp_1697098b-61c4-4dbd-b7d3-6338c7f5e46c/monitoring-plugin/0.log"
Apr 17 08:18:23.858002 ip-10-0-133-228
kubenswrapper[2565]: I0417 08:18:23.857967 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gdn8b_cc81e894-4be4-42e1-8d62-b69a7b840a45/node-exporter/0.log" Apr 17 08:18:23.879733 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.879705 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gdn8b_cc81e894-4be4-42e1-8d62-b69a7b840a45/kube-rbac-proxy/0.log" Apr 17 08:18:23.903461 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:23.903430 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gdn8b_cc81e894-4be4-42e1-8d62-b69a7b840a45/init-textfile/0.log" Apr 17 08:18:24.345524 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:24.345495 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-pf7jc_eaa1d24a-6baa-4384-96d4-2aa7a6b5734a/prometheus-operator-admission-webhook/0.log" Apr 17 08:18:25.903922 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:25.903891 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-tfd6g_b2cc8f03-c0c9-465b-b46c-f4c6b89b56e3/networking-console-plugin/0.log" Apr 17 08:18:26.701824 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:26.701732 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cf69fd875-4t7rp_a9b3278f-7ef0-483b-8da5-89d782a41536/console/0.log" Apr 17 08:18:27.060659 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.060625 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj"] Apr 17 08:18:27.061160 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.061139 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68632be1-ab88-469c-869b-4e1db08ad84e" containerName="kserve-container" Apr 17 08:18:27.061256 
ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.061167 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="68632be1-ab88-469c-869b-4e1db08ad84e" containerName="kserve-container" Apr 17 08:18:27.061318 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.061301 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="68632be1-ab88-469c-869b-4e1db08ad84e" containerName="kserve-container" Apr 17 08:18:27.064406 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.064385 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.066452 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.066427 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nl5sm\"/\"kube-root-ca.crt\"" Apr 17 08:18:27.066452 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.066438 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nl5sm\"/\"default-dockercfg-r66q7\"" Apr 17 08:18:27.067052 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.067028 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nl5sm\"/\"openshift-service-ca.crt\"" Apr 17 08:18:27.070105 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.070080 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj"] Apr 17 08:18:27.252957 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.252920 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-podres\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.252957 ip-10-0-133-228 
kubenswrapper[2565]: I0417 08:18:27.252964 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-proc\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.253169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.253011 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg9vz\" (UniqueName: \"kubernetes.io/projected/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-kube-api-access-pg9vz\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.253169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.253034 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-lib-modules\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.253169 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.253133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-sys\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354238 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354126 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-sys\") pod 
\"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354238 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354189 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-podres\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354459 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354255 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-proc\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354459 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-sys\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354459 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354298 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg9vz\" (UniqueName: \"kubernetes.io/projected/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-kube-api-access-pg9vz\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354459 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proc\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-proc\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354459 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354333 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-lib-modules\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354459 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354373 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-podres\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.354667 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.354458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-lib-modules\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.361453 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.361431 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg9vz\" (UniqueName: \"kubernetes.io/projected/b0f0b2f3-39fc-42f5-af67-2dbeb4172130-kube-api-access-pg9vz\") pod \"perf-node-gather-daemonset-m6vjj\" (UID: \"b0f0b2f3-39fc-42f5-af67-2dbeb4172130\") " pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.374523 ip-10-0-133-228 
kubenswrapper[2565]: I0417 08:18:27.374496 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:27.498337 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.498305 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj"] Apr 17 08:18:27.502533 ip-10-0-133-228 kubenswrapper[2565]: W0417 08:18:27.502507 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb0f0b2f3_39fc_42f5_af67_2dbeb4172130.slice/crio-b8c39cf43ab51117eeec557718d9ce65d68f5514595934e4d70218e11bf714a2 WatchSource:0}: Error finding container b8c39cf43ab51117eeec557718d9ce65d68f5514595934e4d70218e11bf714a2: Status 404 returned error can't find the container with id b8c39cf43ab51117eeec557718d9ce65d68f5514595934e4d70218e11bf714a2 Apr 17 08:18:27.504510 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.504488 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:18:27.868401 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.868360 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-z5xmb_7b550782-b0e2-4efb-9013-806a1ec8d616/dns/0.log" Apr 17 08:18:27.887866 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.887788 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-z5xmb_7b550782-b0e2-4efb-9013-806a1ec8d616/kube-rbac-proxy/0.log" Apr 17 08:18:27.930398 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:27.930370 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wcz5n_f2e063f6-7071-4323-a26c-9d5f28ce786e/dns-node-resolver/0.log" Apr 17 08:18:28.381964 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:28.381929 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-7fbcd8fb7b-hvmtr_77062eb3-1bc4-4732-a8e0-c5a9253bbac5/registry/0.log" Apr 17 08:18:28.414665 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:28.414636 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" event={"ID":"b0f0b2f3-39fc-42f5-af67-2dbeb4172130","Type":"ContainerStarted","Data":"9c93946f26991ef585ac3153f9d7c5e27f5c6044193448f38abc53f024bef26c"} Apr 17 08:18:28.414665 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:28.414670 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" event={"ID":"b0f0b2f3-39fc-42f5-af67-2dbeb4172130","Type":"ContainerStarted","Data":"b8c39cf43ab51117eeec557718d9ce65d68f5514595934e4d70218e11bf714a2"} Apr 17 08:18:28.414861 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:28.414773 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:28.422134 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:28.422109 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mtwj6_b2e0178e-22bf-4ec0-8752-a62c91d1d7a5/node-ca/0.log" Apr 17 08:18:28.431024 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:28.430984 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" podStartSLOduration=1.430971255 podStartE2EDuration="1.430971255s" podCreationTimestamp="2026-04-17 08:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:18:28.428684447 +0000 UTC m=+1620.917565634" watchObservedRunningTime="2026-04-17 08:18:28.430971255 +0000 UTC m=+1620.919852406" Apr 17 08:18:29.113040 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.113012 
2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-779d7cd6d-kbl8s_d999a14b-e053-4d0b-8b72-526fefe663ca/router/0.log" Apr 17 08:18:29.420505 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.420416 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-62zbn_66caf165-b357-465a-87dc-24e5229f236e/serve-healthcheck-canary/0.log" Apr 17 08:18:29.832815 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.832782 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qlgcs_27196395-61d5-4866-b7d2-ebf227547861/insights-operator/1.log" Apr 17 08:18:29.833018 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.832961 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qlgcs_27196395-61d5-4866-b7d2-ebf227547861/insights-operator/0.log" Apr 17 08:18:29.920093 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.920056 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v8457_5f74b0a4-4315-46f8-a669-be2646461e18/kube-rbac-proxy/0.log" Apr 17 08:18:29.939762 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.939736 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v8457_5f74b0a4-4315-46f8-a669-be2646461e18/exporter/0.log" Apr 17 08:18:29.960021 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:29.959994 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v8457_5f74b0a4-4315-46f8-a669-be2646461e18/extractor/0.log" Apr 17 08:18:31.944321 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:31.944276 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-558564fd68-z6v8s_b691469f-c21a-4775-a20e-6e3bea432807/manager/0.log" Apr 17 08:18:31.962844 ip-10-0-133-228 
kubenswrapper[2565]: I0417 08:18:31.962819 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-q7s8x_4c041ea6-e048-48f4-8830-64e163f148ff/manager/0.log" Apr 17 08:18:32.115258 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:32.115199 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-52b6g_b03f7029-34ba-45bd-bb5f-cf4c85d600d9/manager/0.log" Apr 17 08:18:34.427364 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:34.427335 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nl5sm/perf-node-gather-daemonset-m6vjj" Apr 17 08:18:36.152228 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:36.152116 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qfdj6_cdcde5b0-5ef1-4fd5-b0b7-de55988110a6/kube-storage-version-migrator-operator/1.log" Apr 17 08:18:36.154100 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:36.154039 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qfdj6_cdcde5b0-5ef1-4fd5-b0b7-de55988110a6/kube-storage-version-migrator-operator/0.log" Apr 17 08:18:37.142679 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.142650 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/kube-multus-additional-cni-plugins/0.log" Apr 17 08:18:37.162444 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.162412 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/egress-router-binary-copy/0.log" Apr 17 08:18:37.182242 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.182146 2565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/cni-plugins/0.log" Apr 17 08:18:37.201481 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.201448 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/bond-cni-plugin/0.log" Apr 17 08:18:37.221278 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.221253 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/routeoverride-cni/0.log" Apr 17 08:18:37.240726 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.240697 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/whereabouts-cni-bincopy/0.log" Apr 17 08:18:37.263179 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.263154 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9htcw_7e100a52-e772-4d62-a573-6f5b62a4671d/whereabouts-cni/0.log" Apr 17 08:18:37.608851 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.608817 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bgmzb_b8c32bd7-8e6a-401c-93fe-49b96703cd7a/kube-multus/0.log" Apr 17 08:18:37.694146 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.694119 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k6mnq_63918c32-1f1d-43f2-9243-76c8cb35d556/network-metrics-daemon/0.log" Apr 17 08:18:37.714363 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:37.714331 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k6mnq_63918c32-1f1d-43f2-9243-76c8cb35d556/kube-rbac-proxy/0.log" Apr 17 08:18:38.858333 ip-10-0-133-228 kubenswrapper[2565]: 
I0417 08:18:38.858301 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-controller/0.log" Apr 17 08:18:38.874523 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.874486 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/0.log" Apr 17 08:18:38.889356 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.889324 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovn-acl-logging/1.log" Apr 17 08:18:38.913397 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.913369 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/kube-rbac-proxy-node/0.log" Apr 17 08:18:38.938405 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.938365 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:18:38.955044 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.955014 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/northd/0.log" Apr 17 08:18:38.976154 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.976124 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/nbdb/0.log" Apr 17 08:18:38.997338 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:38.997307 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/sbdb/0.log" Apr 17 08:18:39.172768 ip-10-0-133-228 kubenswrapper[2565]: I0417 
08:18:39.172674 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pqdwt_c3bd4b80-ecb6-4dd0-a2e6-88f1d0f483d2/ovnkube-controller/0.log" Apr 17 08:18:40.414579 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:40.414550 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-z7tsc_919fa45a-692a-4f75-a7ff-12f0085459ab/network-check-target-container/0.log" Apr 17 08:18:41.359135 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:41.359094 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8vfz7_7c70fa88-6394-4033-b304-0ea283b0a7eb/iptables-alerter/0.log" Apr 17 08:18:41.995130 ip-10-0-133-228 kubenswrapper[2565]: I0417 08:18:41.995095 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rntgv_753ccc51-1724-4403-97ba-abea000798cc/tuned/0.log"