Apr 22 18:46:43.550274 ip-10-0-133-163 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:43.984408 ip-10-0-133-163 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:43.984408 ip-10-0-133-163 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:43.984408 ip-10-0-133-163 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:43.984408 ip-10-0-133-163 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:43.984408 ip-10-0-133-163 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:43.986038 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.985926 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991461 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991486 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991491 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991494 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991497 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991501 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:43.991495 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991504 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991508 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991512 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991515 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991518 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991521 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991524 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991530 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991533 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991536 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991539 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991542 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991546 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991548 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991551 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991554 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991557 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991560 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991563 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991568 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:43.991765 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991571 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991574 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991577 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991579 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991582 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991585 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991588 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991591 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991593 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991596 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991599 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991602 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991607 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991610 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991615 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991618 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991622 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991625 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991628 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991631 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:43.992273 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991634 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991636 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991639 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991642 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991647 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991650 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991653 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991656 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991658 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991661 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991665 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991669 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991673 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991676 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991679 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991681 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991687 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991689 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991693 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:43.992784 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991699 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991702 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991705 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991707 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991710 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991713 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991717 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991720 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991726 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991729 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991732 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991734 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991737 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991740 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991742 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991745 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991748 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991750 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991753 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991756 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:43.993267 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.991759 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992447 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992455 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992458 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992461 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992464 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992467 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992470 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992479 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992482 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992485 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992488 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992490 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992496 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992499 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992502 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992504 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992507 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992510 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992513 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:43.993774 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992524 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992527 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992531 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992535 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992537 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992543 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992546 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992548 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992551 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992553 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992556 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992559 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992562 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992566 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992570 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992573 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992576 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992582 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992584 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:43.994282 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992587 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992590 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992592 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992596 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992606 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992609 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992611 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992614 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992617 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992620 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992625 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992627 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992630 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992632 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992641 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992644 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992646 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992649 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992651 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992654 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:43.994779 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992657 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992659 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992662 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992667 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992669 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992672 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992674 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992677 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992680 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992682 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992685 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992689 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992691 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992694 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992697 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992700 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992705 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992708 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992711 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992713 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992717 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:43.995296 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992720 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992723 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992725 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992728 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992731 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992734 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.992744 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993142 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993169 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993179 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993186 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993192 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993197 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993203 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993210 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993215 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993220 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993225 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993230 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993235 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993239 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993244 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993248 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:43.995815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993252 2570 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993257 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.993262 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994617 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994622 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994625 2570 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994629 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994633 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994637 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994640 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994643 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994647 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994650 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994653 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994656 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994659 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994663 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994669 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994673 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994676 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994679 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994682 2570 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994686 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994691 2570 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994694 2570 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:46:43.996385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994697 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994700 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994703 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994712 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994715 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994718 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994721 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994724 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994727 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994730 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994733 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994736 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994739 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994742 2570 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994746 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994749 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994752 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994756 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994759 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994763 2570 flags.go:64] FLAG: --help="false"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994765 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-133-163.ec2.internal"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994769 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 18:46:43.997007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994772 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 18:46:43.997007 ip-10-0-133-163
kubenswrapper[2570]: I0422 18:46:43.994775 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994779 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994783 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994786 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994790 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994793 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994797 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994800 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994803 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994806 2570 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994809 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994812 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994815 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994818 2570 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994821 2570 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994824 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994827 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994830 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994836 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994839 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994842 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994845 2570 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994848 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994851 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:43.997577 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994854 2570 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994857 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994863 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994866 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:43.998170 ip-10-0-133-163 
kubenswrapper[2570]: I0422 18:46:43.994871 2570 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994885 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994889 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994893 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994897 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994906 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994910 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994913 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994922 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994925 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994928 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994932 2570 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994935 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994941 2570 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994944 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994947 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994950 2570 flags.go:64] FLAG: --port="10250" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994953 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994956 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0314bc7f860c1c57e" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994959 2570 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:43.998170 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994962 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994965 2570 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994968 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994971 2570 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994974 2570 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994977 2570 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994980 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994983 2570 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994987 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 
18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994990 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994993 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994996 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.994998 2570 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995002 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995006 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995009 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995025 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995030 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995033 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995037 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995040 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995043 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995046 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:46:43.995048 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995052 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995055 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:43.998749 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995058 2570 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995061 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995067 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995070 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995073 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995079 2570 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995081 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995084 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995087 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995090 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995093 2570 flags.go:64] FLAG: --v="2" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995098 2570 flags.go:64] FLAG: --version="false" Apr 22 18:46:43.999390 
ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995102 2570 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995106 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:43.995110 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995218 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995222 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995225 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995228 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995232 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995235 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995237 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995240 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:43.999390 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995244 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995247 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:43.999972 ip-10-0-133-163 
kubenswrapper[2570]: W0422 18:46:43.995249 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995252 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995254 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995257 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995259 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995262 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995265 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995268 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995270 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995274 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995278 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995281 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995283 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995287 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995291 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995294 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995297 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:43.999972 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995300 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995303 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995306 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995308 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995311 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:44.000470 
ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995314 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995316 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995319 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995322 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995324 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995327 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995329 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995332 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995336 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995339 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995342 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995344 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995347 2570 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995349 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:44.000470 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995352 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995354 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995357 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995360 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995362 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995365 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995367 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995370 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995372 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995375 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995377 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: 
W0422 18:46:43.995380 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995382 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995385 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995388 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995390 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995393 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995395 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995398 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995400 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:44.000931 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995403 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995405 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995408 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995410 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 
18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995413 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995416 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995419 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995422 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995424 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995427 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995429 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995432 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995434 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995437 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995439 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995442 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995445 2570 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall
Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995448 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995451 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:44.001430 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:43.995453 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:44.001901 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.000091 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:44.006745 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.006719 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:46:44.006745 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.006741 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006812 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006822 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006827 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006832 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006836 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006841 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006845 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006850 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006854 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006859 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006863 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006867 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006872 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006876 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006880 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006885 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006889 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006893 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006898 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:44.006905 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006902 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006907 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006913 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006917 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006921 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006926 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006930 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006934 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006939 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006943 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006947 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006954 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006962 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006967 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006971 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006975 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006979 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006983 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006988 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006992 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:44.007813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.006996 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007001 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007005 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007009 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007029 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007034 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007038 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007042 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007047 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007050 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007055 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007058 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007062 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007067 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007073 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007077 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007081 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007086 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007090 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:44.008478 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007095 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007099 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007104 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007108 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007112 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007117 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007122 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007127 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007131 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007138 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007143 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007148 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007153 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007157 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007162 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007166 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007170 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007175 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007179 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007183 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:44.009091 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007187 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007192 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007196 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007200 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007205 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007210 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007214 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007219 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.007227 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007402 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007412 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007416 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007421 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007426 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007431 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:44.009834 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007435 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007440 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007444 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007448 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007452 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007457 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007461 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007465 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007470 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007474 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007478 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007482 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007486 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007490 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007495 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007499 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007503 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007507 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007511 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007515 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:44.010482 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007519 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007523 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007528 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007532 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007536 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007540 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007544 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007548 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007552 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007556 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007560 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007565 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007569 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007573 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007577 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007581 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007585 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007591 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007596 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:44.011135 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007600 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007604 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007609 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007613 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007617 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007622 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007626 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007630 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007634 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007638 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007642 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007646 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007650 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007654 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007659 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007663 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007668 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007672 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007676 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007680 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:44.011723 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007685 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007689 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007696 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007702 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007709 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007714 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007719 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007723 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007727 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007731 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007736 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007741 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007745 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007750 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007754 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007758 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007762 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007766 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007770 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007775 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:44.012454 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:44.007779 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:44.012943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.007788 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:44.012943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.008515 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:46:44.012943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.011330 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:46:44.012943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.012234 2570 server.go:1019] "Starting client certificate rotation"
Apr 22 18:46:44.012943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.012333 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:44.012943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.012368 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:44.039249 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.039224 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:44.042585 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.042566 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:44.056509 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.056487 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:46:44.063194 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.063170 2570 log.go:25] "Validated CRI v1 image API"
Apr 22 18:46:44.064711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.064693 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:44.064899 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.064884 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:46:44.070308 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.070286 2570 fs.go:135] Filesystem UUIDs: map[6c24f2e7-bd46-4b44-9a4e-6a1b80244b9e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a2192a1d-6971-48bb-9d9c-bd3f94908380:/dev/nvme0n1p3]
Apr 22 18:46:44.070388 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.070309 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:46:44.076177 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.076067 2570 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:44.074177725 +0000 UTC m=+0.405686168 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3141712 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec218ac7eb8a12af9e72aa9dc860a0f5 SystemUUID:ec218ac7-eb8a-12af-9e72-aa9dc860a0f5 BootID:fccc51f3-8919-4fb6-b7fc-78343fbe2399 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0e:39:cc:87:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0e:39:cc:87:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:21:b0:1d:02:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:46:44.076177 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.076166 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:46:44.076341 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.076282 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:46:44.077240 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.077214 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:46:44.077422 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.077242 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-163.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:46:44.077498 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.077436 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:46:44.077498 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.077448 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:46:44.077498 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.077467 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:46:44.078266 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.078253 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:46:44.079077 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.079066 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:46:44.079200 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.079190 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:46:44.082275 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.082265 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:46:44.082345 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.082289 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:46:44.082345 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.082309 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:46:44.082345 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.082325 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:46:44.082469 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.082350 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:46:44.083399 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.083386 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:46:44.083469 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.083413 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:46:44.086240 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.086222 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ngtbn"
Apr 22 18:46:44.086481 ip-10-0-133-163
kubenswrapper[2570]: I0422 18:46:44.086465 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:44.087744 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.087730 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:44.089313 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089301 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089319 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089326 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089332 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089338 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089344 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089350 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089356 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:44.089362 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089366 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:44.089592 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089372 2570 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:44.089592 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089385 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:44.089650 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.089630 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:44.090458 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.090444 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:44.090496 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.090461 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:44.092790 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.092762 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-163.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:44.093365 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.093263 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:44.093473 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.093435 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-163.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:44.094385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.094255 2570 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ngtbn" Apr 22 18:46:44.095320 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.095281 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:44.095647 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.095634 2570 server.go:1295] "Started kubelet" Apr 22 18:46:44.095878 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.095753 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:44.095936 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.095836 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:44.095973 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.095957 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:44.096746 ip-10-0-133-163 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:46:44.097397 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.097368 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:44.098615 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.098600 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:44.101906 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.100432 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-163.ec2.internal.18a8c2393d51980b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-163.ec2.internal,UID:ip-10-0-133-163.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-163.ec2.internal,},FirstTimestamp:2026-04-22 18:46:44.095342603 +0000 UTC 
m=+0.426851034,LastTimestamp:2026-04-22 18:46:44.095342603 +0000 UTC m=+0.426851034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-163.ec2.internal,}" Apr 22 18:46:44.103902 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.103886 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:44.104944 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.104931 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:44.106172 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.106147 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:44.106772 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.106753 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:44.106863 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.106775 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:44.106863 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.106753 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:44.106961 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.106913 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:44.106961 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.106922 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:44.107089 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.107004 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.107218 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.107203 2570 factory.go:55] Registering systemd factory Apr 22 18:46:44.107256 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:46:44.107231 2570 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:44.107484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.107470 2570 factory.go:153] Registering CRI-O factory Apr 22 18:46:44.107548 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.107486 2570 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:44.107548 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.107535 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:44.107642 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.107568 2570 factory.go:103] Registering Raw factory Apr 22 18:46:44.107642 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.107586 2570 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:44.108171 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.108157 2570 manager.go:319] Starting recovery of all containers Apr 22 18:46:44.118460 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.118323 2570 manager.go:324] Recovery completed Apr 22 18:46:44.119542 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.119516 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:44.122160 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.122139 2570 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:46:44.123526 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.123510 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-163.ec2.internal\" not found" 
node="ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.125142 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.125131 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:44.127604 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.127591 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:44.127664 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.127616 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:44.127664 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.127626 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:44.128175 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.128162 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:44.128231 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.128175 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:44.128231 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.128194 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:44.130403 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.130389 2570 policy_none.go:49] "None policy: Start" Apr 22 18:46:44.130437 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.130410 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:44.130437 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.130423 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:44.173616 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.173595 2570 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.173636 2570 manager.go:517] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.173647 2570 server.go:85] "Starting device plugin registration server" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.173873 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.173883 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.173992 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.174088 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.174096 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.174563 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:44.180991 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.174598 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.241897 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.241821 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:44.242992 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.242973 2570 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 18:46:44.243115 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.243034 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:44.243115 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.243054 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:46:44.243115 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.243060 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:44.243115 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.243095 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:44.245628 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.245612 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:44.274726 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.274708 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:44.275623 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.275606 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:44.275707 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.275645 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:44.275707 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.275663 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:44.275707 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.275694 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.284188 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:46:44.284174 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.284271 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.284194 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-163.ec2.internal\": node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.313678 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.313660 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.343322 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.343298 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal"] Apr 22 18:46:44.343394 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.343370 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:44.344757 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.344742 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:44.344844 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.344776 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:44.344844 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.344790 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:44.346157 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346142 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:44.346315 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346301 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.346367 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346330 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:44.346813 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346798 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:44.346872 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346800 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:44.346872 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346852 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:44.346872 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346864 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:44.347028 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346824 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:44.347028 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.346912 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:44.347940 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.347926 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.347999 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.347952 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:44.348650 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.348636 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:44.348723 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.348684 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:44.348723 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.348699 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:44.373746 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.373726 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-163.ec2.internal\" not found" node="ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.378136 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.378121 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-163.ec2.internal\" not found" node="ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.407867 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.407842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.407867 ip-10-0-133-163 
kubenswrapper[2570]: I0422 18:46:44.407870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.408085 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.407888 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8147dca2f1846ffe58ac40c8a9cdfc0b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-163.ec2.internal\" (UID: \"8147dca2f1846ffe58ac40c8a9cdfc0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.413740 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.413718 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.508546 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.508470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.508546 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.508505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.508546 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.508523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8147dca2f1846ffe58ac40c8a9cdfc0b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-163.ec2.internal\" (UID: \"8147dca2f1846ffe58ac40c8a9cdfc0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.508546 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.508543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.508745 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.508562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.508745 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.508547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8147dca2f1846ffe58ac40c8a9cdfc0b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-163.ec2.internal\" (UID: \"8147dca2f1846ffe58ac40c8a9cdfc0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.514551 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.514530 2570 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.615313 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.615276 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.675488 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.675464 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.681038 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.680999 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 22 18:46:44.716041 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.716004 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.816604 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:44.816500 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 22 18:46:44.907124 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:44.907093 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.007411 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.007377 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:45.012116 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.012100 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:45.012243 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.012224 2570 kubelet.go:3342] "Failed creating a mirror pod" err="Post 
\"https://af2f49ad88e0c40219fa5bb2645ef118-8d5ba8e97fbf2710.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.133.163:50432->34.195.103.195:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 22 18:46:45.012287 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.012246 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 22 18:46:45.012287 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.012259 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:45.012352 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.012283 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:45.012352 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.012259 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:45.028399 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.028375 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:45.083037 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.082968 2570 apiserver.go:52] "Watching apiserver" 
Apr 22 18:46:45.095099 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.095079 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:46:45.096375 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.096353 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-wcprf","openshift-ovn-kubernetes/ovnkube-node-tlrd2","kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj","openshift-cluster-node-tuning-operator/tuned-7rzxj","openshift-dns/node-resolver-cpq2l","openshift-image-registry/node-ca-lxsx5","openshift-multus/multus-47h8f","openshift-multus/network-metrics-daemon-mf94f","openshift-network-diagnostics/network-check-target-57n8t","kube-system/konnectivity-agent-kzsjh","openshift-multus/multus-additional-cni-plugins-c74jc"]
Apr 22 18:46:45.096450 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.096366 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:44 +0000 UTC" deadline="2027-12-21 12:06:11.028751406 +0000 UTC"
Apr 22 18:46:45.096450 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.096387 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14585h19m25.932367072s"
Apr 22 18:46:45.101032 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.099642 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.102992 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.102966 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.103858 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.103836 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.103995 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.103901 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:46:45.104087 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.104065 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.104165 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.104102 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-g9xd4\""
Apr 22 18:46:45.105147 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.105131 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:46:45.105279 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.105264 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.105593 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.105575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:46:45.105698 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.105678 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:46:45.107067 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107049 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:46:45.107438 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107418 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.107606 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107583 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sbgr7\""
Apr 22 18:46:45.107606 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107596 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:46:45.107726 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107591 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.107897 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107877 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.107991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107905 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.107991 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.107934 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pd4b9\""
Apr 22 18:46:45.108657 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.108640 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:46:45.108753 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.108667 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.109641 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.109626 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cpq2l"
Apr 22 18:46:45.109735 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.109625 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fvffw\""
Apr 22 18:46:45.111136 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.111122 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.111669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.111653 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.111859 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.111844 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.111930 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.111913 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lmkzb\""
Apr 22 18:46:45.111985 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.111939 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.112044 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.111990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lxsx5"
Apr 22 18:46:45.112214 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112183 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-sys-fs\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.112305 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnx49\" (UniqueName: \"kubernetes.io/projected/a26938d0-f0cc-41e8-a082-51c312992e57-kube-api-access-gnx49\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.112305 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112239 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysctl-d\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112305 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112262 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-host\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112305 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/814fc44d-7767-426d-8e7f-760f0016f42f-tmp\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dshm\" (UniqueName: \"kubernetes.io/projected/587139e8-f488-4657-8806-34d257b2339c-kube-api-access-5dshm\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112329 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-systemd\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112346 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysctl-conf\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-sys\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-var-lib-kubelet\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-systemd-units\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-etc-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvggm\" (UniqueName: \"kubernetes.io/projected/814fc44d-7767-426d-8e7f-760f0016f42f-kube-api-access-tvggm\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-log-socket\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-cni-bin\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-cni-netd\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-env-overrides\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112534 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovnkube-script-lib\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112569 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-modprobe-d\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-ovn\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112621 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovn-node-metrics-cert\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112638 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-registration-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-etc-selinux\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-kubelet\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-run-netns\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.112704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112709 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-var-lib-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-node-log\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-run-ovn-kubernetes\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhr8l\" (UniqueName: \"kubernetes.io/projected/4f7c20eb-284f-4276-b58c-ed5062c1325e-kube-api-access-rhr8l\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysconfig\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112820 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-run\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-slash\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-socket-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112904 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-kubernetes\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-systemd\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-lib-modules\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/587139e8-f488-4657-8806-34d257b2339c-host-slash\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112978 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovnkube-config\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.112999 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.113048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/814fc44d-7767-426d-8e7f-760f0016f42f-etc-tuned\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.113396 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.113073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/587139e8-f488-4657-8806-34d257b2339c-iptables-alerter-script\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.113898 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.113095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-device-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.114116 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.114099 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zv54v\""
Apr 22 18:46:45.114232 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.114219 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.114298 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.114223 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.114298 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.114244 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:46:45.115083 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.115068 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.116595 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.116574 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:46:45.116694 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.116601 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:45.116787 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.116701 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:45.116787 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.116735 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:46:45.116918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.116813 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9whsv\""
Apr 22 18:46:45.116918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.116834 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:46:45.116918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.116845 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:46:45.122203 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.121803 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kzsjh"
Apr 22 18:46:45.122311 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.122162 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:45.122311 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.122299 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:45.124387 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.124368 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:46:45.124705 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.124666 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.124705 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.124697 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8lqh6\""
Apr 22 18:46:45.125907 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.125890 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:46:45.127463 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.127439 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:46:45.127541 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.127503 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4jknq\""
Apr 22 18:46:45.127541 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.127517 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:46:45.129901 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.129886 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:45.180934 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.180909 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tkp5p"
Apr 22 18:46:45.186756 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.186735 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tkp5p"
Apr 22 18:46:45.208216 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.208197 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:46:45.213352 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213327 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysctl-d\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.213454 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dshm\" (UniqueName: \"kubernetes.io/projected/587139e8-f488-4657-8806-34d257b2339c-kube-api-access-5dshm\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.213454 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-sys\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.213563 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-var-lib-kubelet\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.213563 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-etc-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.213563 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-daemon-config\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.213684 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213569 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-os-release\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.213684 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213573 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-var-lib-kubelet\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.213684 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-etc-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.213684 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysctl-d\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.213684 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-cni-bin\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.213902 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-env-overrides\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.213902 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-sys\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.213902 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName:
\"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-cni-bin\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.213902 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovnkube-script-lib\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.213902 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213827 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-conf-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/188e6686-56b2-4173-8c6e-37c8297781e8-agent-certs\") pod \"konnectivity-agent-kzsjh\" (UID: \"188e6686-56b2-4173-8c6e-37c8297781e8\") " pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.213990 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e190ea4b-606c-4dd3-9785-eb0178af92e9-hosts-file\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214048 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcgc\" (UniqueName: \"kubernetes.io/projected/e190ea4b-606c-4dd3-9785-eb0178af92e9-kube-api-access-6lcgc\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-modprobe-d\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovn-node-metrics-cert\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214182 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-env-overrides\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214228 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-modprobe-d\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:46:45.214217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214270 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-kubelet\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-run-netns\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214321 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-kubelet\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-var-lib-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" 
Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-var-lib-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214367 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovnkube-script-lib\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-run-netns\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-node-log\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214468 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:45.214529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214502 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-cni-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214539 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-netns\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-node-log\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysconfig\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysconfig\") pod 
\"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214657 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-slash\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06ac6726-1f7b-4981-a05c-4538095c85b7-cni-binary-copy\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-k8s-cni-cncf-io\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214737 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/188e6686-56b2-4173-8c6e-37c8297781e8-konnectivity-ca\") pod \"konnectivity-agent-kzsjh\" (UID: \"188e6686-56b2-4173-8c6e-37c8297781e8\") " pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-slash\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-cni-binary-copy\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214802 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-systemd\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-lib-modules\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:46:45.214850 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovnkube-config\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-systemd\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-sys-fs\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" Apr 22 18:46:45.214970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-sys-fs\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-cnibin\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:46:45.214954 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-system-cni-dir\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/814fc44d-7767-426d-8e7f-760f0016f42f-etc-tuned\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214991 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/587139e8-f488-4657-8806-34d257b2339c-iptables-alerter-script\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215027 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-device-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldg7\" (UniqueName: \"kubernetes.io/projected/383fe532-b742-451a-8a94-dc5c7fd3fce5-kube-api-access-lldg7\") pod \"network-metrics-daemon-mf94f\" (UID: 
\"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrctf\" (UniqueName: \"kubernetes.io/projected/94116840-a7e4-4953-83a7-56e00b343c31-kube-api-access-zrctf\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-host\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/814fc44d-7767-426d-8e7f-760f0016f42f-tmp\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215112 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-systemd\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-kubelet\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysctl-conf\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-systemd-units\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215173 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-cni-bin\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0484c97f-ca09-4491-bd36-1cd68e364f27-host\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvggm\" 
(UniqueName: \"kubernetes.io/projected/814fc44d-7767-426d-8e7f-760f0016f42f-kube-api-access-tvggm\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.215766 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-log-socket\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-cni-netd\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-os-release\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215301 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-socket-dir-parent\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215317 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-multus-certs\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215333 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-cnibin\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-ovn\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-registration-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215366 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/587139e8-f488-4657-8806-34d257b2339c-iptables-alerter-script\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215371 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovnkube-config\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-etc-selinux\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjgp\" (UniqueName: \"kubernetes.io/projected/06ac6726-1f7b-4981-a05c-4538095c85b7-kube-api-access-vsjgp\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.214991 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-lib-modules\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0484c97f-ca09-4491-bd36-1cd68e364f27-serviceca\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-etc-selinux\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-run-ovn-kubernetes\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.216584 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215498 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-device-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhr8l\" (UniqueName: \"kubernetes.io/projected/4f7c20eb-284f-4276-b58c-ed5062c1325e-kube-api-access-rhr8l\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-cni-multus\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-host\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-openvswitch\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-sysctl-conf\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215679 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-log-socket\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-cni-netd\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-systemd-units\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-run-ovn-kubernetes\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-registration-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215754 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-hostroot\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215766 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-systemd\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-run-ovn\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215845 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-run\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215918 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-run\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.217386 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-socket-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.215995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216030 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-kubernetes\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216089 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f7c20eb-284f-4276-b58c-ed5062c1325e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-socket-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216141 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/814fc44d-7767-426d-8e7f-760f0016f42f-etc-kubernetes\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/587139e8-f488-4657-8806-34d257b2339c-host-slash\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnx49\" (UniqueName: \"kubernetes.io/projected/a26938d0-f0cc-41e8-a082-51c312992e57-kube-api-access-gnx49\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/587139e8-f488-4657-8806-34d257b2339c-host-slash\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216265 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-etc-kubernetes\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e190ea4b-606c-4dd3-9785-eb0178af92e9-tmp-dir\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-system-cni-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a26938d0-f0cc-41e8-a082-51c312992e57-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.218050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.216405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9c8\" (UniqueName: \"kubernetes.io/projected/0484c97f-ca09-4491-bd36-1cd68e364f27-kube-api-access-dv9c8\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5"
Apr 22 18:46:45.218520 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.218044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/814fc44d-7767-426d-8e7f-760f0016f42f-etc-tuned\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.218520 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.218079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/814fc44d-7767-426d-8e7f-760f0016f42f-tmp\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.218520 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.218144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f7c20eb-284f-4276-b58c-ed5062c1325e-ovn-node-metrics-cert\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.221744 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.221718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dshm\" (UniqueName: \"kubernetes.io/projected/587139e8-f488-4657-8806-34d257b2339c-kube-api-access-5dshm\") pod \"iptables-alerter-wcprf\" (UID: \"587139e8-f488-4657-8806-34d257b2339c\") " pod="openshift-network-operator/iptables-alerter-wcprf"
Apr 22 18:46:45.224493 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.224392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhr8l\" (UniqueName: \"kubernetes.io/projected/4f7c20eb-284f-4276-b58c-ed5062c1325e-kube-api-access-rhr8l\") pod \"ovnkube-node-tlrd2\" (UID: \"4f7c20eb-284f-4276-b58c-ed5062c1325e\") " pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2"
Apr 22 18:46:45.224589 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.224526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvggm\" (UniqueName: \"kubernetes.io/projected/814fc44d-7767-426d-8e7f-760f0016f42f-kube-api-access-tvggm\") pod \"tuned-7rzxj\" (UID: \"814fc44d-7767-426d-8e7f-760f0016f42f\") " pod="openshift-cluster-node-tuning-operator/tuned-7rzxj"
Apr 22 18:46:45.224777 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.224755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnx49\" (UniqueName: \"kubernetes.io/projected/a26938d0-f0cc-41e8-a082-51c312992e57-kube-api-access-gnx49\") pod \"aws-ebs-csi-driver-node-mnznj\" (UID: \"a26938d0-f0cc-41e8-a082-51c312992e57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj"
Apr 22 18:46:45.238566 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.238539 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8147dca2f1846ffe58ac40c8a9cdfc0b.slice/crio-2af8fb0706da222f9715325727fbfb261c77b15429924a0b1a56db3b4441aca1 WatchSource:0}: Error finding container 2af8fb0706da222f9715325727fbfb261c77b15429924a0b1a56db3b4441aca1: Status 404 returned error can't find the container with id 2af8fb0706da222f9715325727fbfb261c77b15429924a0b1a56db3b4441aca1
Apr 22 18:46:45.239032 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.238991 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b77433e362f2114f13c38a959650d25.slice/crio-5d71152fbb64862ebe36ce45642fb9c25b5e652adb398ad943d01fedb54c6076 WatchSource:0}: Error finding container 5d71152fbb64862ebe36ce45642fb9c25b5e652adb398ad943d01fedb54c6076: Status 404 returned error can't find the container with id 5d71152fbb64862ebe36ce45642fb9c25b5e652adb398ad943d01fedb54c6076
Apr 22 18:46:45.245610 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.245591 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:46:45.249334 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.249282 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" event={"ID":"8147dca2f1846ffe58ac40c8a9cdfc0b","Type":"ContainerStarted","Data":"2af8fb0706da222f9715325727fbfb261c77b15429924a0b1a56db3b4441aca1"}
Apr 22 18:46:45.250265 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.250247 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" event={"ID":"7b77433e362f2114f13c38a959650d25","Type":"ContainerStarted","Data":"5d71152fbb64862ebe36ce45642fb9c25b5e652adb398ad943d01fedb54c6076"}
Apr 22 18:46:45.316625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-daemon-config\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-os-release\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-conf-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/188e6686-56b2-4173-8c6e-37c8297781e8-agent-certs\") pod \"konnectivity-agent-kzsjh\" (UID: \"188e6686-56b2-4173-8c6e-37c8297781e8\") " pod="kube-system/konnectivity-agent-kzsjh"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316680 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e190ea4b-606c-4dd3-9785-eb0178af92e9-hosts-file\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcgc\" (UniqueName: \"kubernetes.io/projected/e190ea4b-606c-4dd3-9785-eb0178af92e9-kube-api-access-6lcgc\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-conf-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316730 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-os-release\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-cni-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e190ea4b-606c-4dd3-9785-eb0178af92e9-hosts-file\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316804 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-netns\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-cni-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06ac6726-1f7b-4981-a05c-4538095c85b7-cni-binary-copy\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-k8s-cni-cncf-io\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/188e6686-56b2-4173-8c6e-37c8297781e8-konnectivity-ca\") pod \"konnectivity-agent-kzsjh\" (UID: \"188e6686-56b2-4173-8c6e-37c8297781e8\") " pod="kube-system/konnectivity-agent-kzsjh"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316873 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-netns\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.316870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316881 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-cni-binary-copy\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-k8s-cni-cncf-io\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.316980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-cnibin\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-system-cni-dir\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lldg7\" (UniqueName: \"kubernetes.io/projected/383fe532-b742-451a-8a94-dc5c7fd3fce5-kube-api-access-lldg7\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrctf\" (UniqueName: \"kubernetes.io/projected/94116840-a7e4-4953-83a7-56e00b343c31-kube-api-access-zrctf\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317102 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-cnibin\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-kubelet\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-cni-bin\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-system-cni-dir\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0484c97f-ca09-4491-bd36-1cd68e364f27-host\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-kubelet\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-os-release\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-socket-dir-parent\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-multus-certs\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-cnibin\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.317555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjgp\" (UniqueName: \"kubernetes.io/projected/06ac6726-1f7b-4981-a05c-4538095c85b7-kube-api-access-vsjgp\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0484c97f-ca09-4491-bd36-1cd68e364f27-serviceca\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-cni-multus\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-hostroot\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-cnibin\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/06ac6726-1f7b-4981-a05c-4538095c85b7-cni-binary-copy\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317450 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-run-multus-certs\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-socket-dir-parent\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317477 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317485 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/188e6686-56b2-4173-8c6e-37c8297781e8-konnectivity-ca\") pod \"konnectivity-agent-kzsjh\" (UID: \"188e6686-56b2-4173-8c6e-37c8297781e8\") " pod="kube-system/konnectivity-agent-kzsjh"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0484c97f-ca09-4491-bd36-1cd68e364f27-host\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317523 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-cni-bin\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317533 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-host-var-lib-cni-multus\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f"
Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317544 2570 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-etc-kubernetes\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-hostroot\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e190ea4b-606c-4dd3-9785-eb0178af92e9-tmp-dir\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l" Apr 22 18:46:45.318236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317622 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-system-cni-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-os-release\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317667 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/94116840-a7e4-4953-83a7-56e00b343c31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317677 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9c8\" (UniqueName: \"kubernetes.io/projected/0484c97f-ca09-4491-bd36-1cd68e364f27-kube-api-access-dv9c8\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317696 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-system-cni-dir\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.317753 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317763 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06ac6726-1f7b-4981-a05c-4538095c85b7-etc-kubernetes\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.317818 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:46:45.817798273 +0000 UTC m=+2.149306724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317842 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0484c97f-ca09-4491-bd36-1cd68e364f27-serviceca\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.317980 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.318006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e190ea4b-606c-4dd3-9785-eb0178af92e9-tmp-dir\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.318391 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-cni-binary-copy\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " 
pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.318452 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94116840-a7e4-4953-83a7-56e00b343c31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.318724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.318459 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/06ac6726-1f7b-4981-a05c-4538095c85b7-multus-daemon-config\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.319264 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.319248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/188e6686-56b2-4173-8c6e-37c8297781e8-agent-certs\") pod \"konnectivity-agent-kzsjh\" (UID: \"188e6686-56b2-4173-8c6e-37c8297781e8\") " pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:46:45.324414 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.324387 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:45.324414 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.324406 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:45.324414 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.324419 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod 
openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.324617 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.324483 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:45.824465704 +0000 UTC m=+2.155974134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.326477 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.326456 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcgc\" (UniqueName: \"kubernetes.io/projected/e190ea4b-606c-4dd3-9785-eb0178af92e9-kube-api-access-6lcgc\") pod \"node-resolver-cpq2l\" (UID: \"e190ea4b-606c-4dd3-9785-eb0178af92e9\") " pod="openshift-dns/node-resolver-cpq2l" Apr 22 18:46:45.326673 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.326651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv9c8\" (UniqueName: \"kubernetes.io/projected/0484c97f-ca09-4491-bd36-1cd68e364f27-kube-api-access-dv9c8\") pod \"node-ca-lxsx5\" (UID: \"0484c97f-ca09-4491-bd36-1cd68e364f27\") " pod="openshift-image-registry/node-ca-lxsx5" Apr 22 18:46:45.326797 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.326780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zrctf\" (UniqueName: \"kubernetes.io/projected/94116840-a7e4-4953-83a7-56e00b343c31-kube-api-access-zrctf\") pod \"multus-additional-cni-plugins-c74jc\" (UID: \"94116840-a7e4-4953-83a7-56e00b343c31\") " pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.327317 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.327289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldg7\" (UniqueName: \"kubernetes.io/projected/383fe532-b742-451a-8a94-dc5c7fd3fce5-kube-api-access-lldg7\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:46:45.327400 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.327363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjgp\" (UniqueName: \"kubernetes.io/projected/06ac6726-1f7b-4981-a05c-4538095c85b7-kube-api-access-vsjgp\") pod \"multus-47h8f\" (UID: \"06ac6726-1f7b-4981-a05c-4538095c85b7\") " pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.436555 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.436479 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wcprf" Apr 22 18:46:45.442275 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.442247 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587139e8_f488_4657_8806_34d257b2339c.slice/crio-7a11d9ac5a5f1cd5063447dea0345491f0aad5666dc656579271dddc51ad8ebd WatchSource:0}: Error finding container 7a11d9ac5a5f1cd5063447dea0345491f0aad5666dc656579271dddc51ad8ebd: Status 404 returned error can't find the container with id 7a11d9ac5a5f1cd5063447dea0345491f0aad5666dc656579271dddc51ad8ebd Apr 22 18:46:45.455377 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.455356 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:46:45.461007 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.460979 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7c20eb_284f_4276_b58c_ed5062c1325e.slice/crio-f6fa4c8d43cc79c6694109b703a738a19d6b589493597123e5d20a740540b0c8 WatchSource:0}: Error finding container f6fa4c8d43cc79c6694109b703a738a19d6b589493597123e5d20a740540b0c8: Status 404 returned error can't find the container with id f6fa4c8d43cc79c6694109b703a738a19d6b589493597123e5d20a740540b0c8 Apr 22 18:46:45.465905 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.465886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" Apr 22 18:46:45.471982 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.471956 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26938d0_f0cc_41e8_a082_51c312992e57.slice/crio-1dbe01916e9888061c1b68bdb59d9a85f396052c50662bd6c0a5428eae655bc1 WatchSource:0}: Error finding container 1dbe01916e9888061c1b68bdb59d9a85f396052c50662bd6c0a5428eae655bc1: Status 404 returned error can't find the container with id 1dbe01916e9888061c1b68bdb59d9a85f396052c50662bd6c0a5428eae655bc1 Apr 22 18:46:45.485032 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.484998 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" Apr 22 18:46:45.489491 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.489470 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cpq2l" Apr 22 18:46:45.491678 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.491658 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814fc44d_7767_426d_8e7f_760f0016f42f.slice/crio-7ce2f8884df6ad718af1ef95e5a34dc235b96b8c14e6a78eba31d3a6450324b8 WatchSource:0}: Error finding container 7ce2f8884df6ad718af1ef95e5a34dc235b96b8c14e6a78eba31d3a6450324b8: Status 404 returned error can't find the container with id 7ce2f8884df6ad718af1ef95e5a34dc235b96b8c14e6a78eba31d3a6450324b8 Apr 22 18:46:45.496402 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.496381 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode190ea4b_606c_4dd3_9785_eb0178af92e9.slice/crio-09ad2e6153a26860d55fe01e9654866c870aecaf98765693ad6fb1986ed695dc WatchSource:0}: Error finding container 09ad2e6153a26860d55fe01e9654866c870aecaf98765693ad6fb1986ed695dc: Status 404 returned error can't find the container with id 09ad2e6153a26860d55fe01e9654866c870aecaf98765693ad6fb1986ed695dc Apr 22 18:46:45.506083 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.506067 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lxsx5" Apr 22 18:46:45.512788 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.512768 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0484c97f_ca09_4491_bd36_1cd68e364f27.slice/crio-59167931d6ec4c252b5506a4f497a20122e606e02db6bfe33d0067c6bdf4a1b9 WatchSource:0}: Error finding container 59167931d6ec4c252b5506a4f497a20122e606e02db6bfe33d0067c6bdf4a1b9: Status 404 returned error can't find the container with id 59167931d6ec4c252b5506a4f497a20122e606e02db6bfe33d0067c6bdf4a1b9 Apr 22 18:46:45.526858 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.526838 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47h8f" Apr 22 18:46:45.532435 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.532409 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06ac6726_1f7b_4981_a05c_4538095c85b7.slice/crio-7cb1eb79d72f7d8079928ace7e7b04f8708e7ebde459787b621323642d772b67 WatchSource:0}: Error finding container 7cb1eb79d72f7d8079928ace7e7b04f8708e7ebde459787b621323642d772b67: Status 404 returned error can't find the container with id 7cb1eb79d72f7d8079928ace7e7b04f8708e7ebde459787b621323642d772b67 Apr 22 18:46:45.532435 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.532421 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:46:45.536308 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.536290 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c74jc" Apr 22 18:46:45.540074 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.539948 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188e6686_56b2_4173_8c6e_37c8297781e8.slice/crio-d28c0213fc7fa9ac8fbb8b9dbd7be00424f2c8e6a5d26b9969b27bd059f6c768 WatchSource:0}: Error finding container d28c0213fc7fa9ac8fbb8b9dbd7be00424f2c8e6a5d26b9969b27bd059f6c768: Status 404 returned error can't find the container with id d28c0213fc7fa9ac8fbb8b9dbd7be00424f2c8e6a5d26b9969b27bd059f6c768 Apr 22 18:46:45.543987 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:46:45.543967 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94116840_a7e4_4953_83a7_56e00b343c31.slice/crio-476a7d84a71df1e504818ebeda33f221ed8356b43b2923780d3b33626a930d34 WatchSource:0}: Error finding container 476a7d84a71df1e504818ebeda33f221ed8356b43b2923780d3b33626a930d34: Status 404 returned error can't find the container with id 476a7d84a71df1e504818ebeda33f221ed8356b43b2923780d3b33626a930d34 Apr 22 18:46:45.595772 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.595746 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.821883 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.821258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:46:45.821883 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.821423 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.821883 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.821488 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.821469759 +0000 UTC m=+3.152978194 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.922566 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:45.922529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:46:45.922724 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.922692 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:45.922724 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.922711 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:45.922724 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.922724 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.922903 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:45.922783 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.922765616 +0000 UTC m=+3.254274051 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:46.187880 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.187738 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:45 +0000 UTC" deadline="2028-01-27 10:32:06.412353546 +0000 UTC" Apr 22 18:46:46.187880 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.187781 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15471h45m20.224577713s" Apr 22 18:46:46.245820 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.245267 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:46:46.245820 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.245433 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:46:46.274921 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.274884 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47h8f" event={"ID":"06ac6726-1f7b-4981-a05c-4538095c85b7","Type":"ContainerStarted","Data":"7cb1eb79d72f7d8079928ace7e7b04f8708e7ebde459787b621323642d772b67"} Apr 22 18:46:46.286249 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.286214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"f6fa4c8d43cc79c6694109b703a738a19d6b589493597123e5d20a740540b0c8"} Apr 22 18:46:46.297718 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.297652 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wcprf" event={"ID":"587139e8-f488-4657-8806-34d257b2339c","Type":"ContainerStarted","Data":"7a11d9ac5a5f1cd5063447dea0345491f0aad5666dc656579271dddc51ad8ebd"} Apr 22 18:46:46.315230 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.315177 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerStarted","Data":"476a7d84a71df1e504818ebeda33f221ed8356b43b2923780d3b33626a930d34"} Apr 22 18:46:46.336425 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.336243 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kzsjh" event={"ID":"188e6686-56b2-4173-8c6e-37c8297781e8","Type":"ContainerStarted","Data":"d28c0213fc7fa9ac8fbb8b9dbd7be00424f2c8e6a5d26b9969b27bd059f6c768"} Apr 22 18:46:46.352976 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.352943 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-lxsx5" event={"ID":"0484c97f-ca09-4491-bd36-1cd68e364f27","Type":"ContainerStarted","Data":"59167931d6ec4c252b5506a4f497a20122e606e02db6bfe33d0067c6bdf4a1b9"} Apr 22 18:46:46.367528 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.366726 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cpq2l" event={"ID":"e190ea4b-606c-4dd3-9785-eb0178af92e9","Type":"ContainerStarted","Data":"09ad2e6153a26860d55fe01e9654866c870aecaf98765693ad6fb1986ed695dc"} Apr 22 18:46:46.375385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.375341 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" event={"ID":"814fc44d-7767-426d-8e7f-760f0016f42f","Type":"ContainerStarted","Data":"7ce2f8884df6ad718af1ef95e5a34dc235b96b8c14e6a78eba31d3a6450324b8"} Apr 22 18:46:46.386380 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.386339 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" event={"ID":"a26938d0-f0cc-41e8-a082-51c312992e57","Type":"ContainerStarted","Data":"1dbe01916e9888061c1b68bdb59d9a85f396052c50662bd6c0a5428eae655bc1"} Apr 22 18:46:46.447829 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.447525 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:46.501987 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.501890 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:46.830269 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.830187 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " 
pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:46.830419 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.830328 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.830419 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.830392 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:48.830373226 +0000 UTC m=+5.161881660 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.930724 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:46.930684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:46.930900 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.930866 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:46.930900 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.930890 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:46.931028 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.930904 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:46.931028 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:46.930968 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:48.930942692 +0000 UTC m=+5.262451123 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:47.189153 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:47.188141 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:45 +0000 UTC" deadline="2028-01-21 07:33:04.173582082 +0000 UTC"
Apr 22 18:46:47.189153 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:47.188215 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15324h46m16.985371763s"
Apr 22 18:46:47.243982 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:47.243951 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:47.244158 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:47.244087 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:47.246756 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:47.246730 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:48.246601 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:48.246125 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:48.246601 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.246259 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:48.844617 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:48.844584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:48.844784 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.844756 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:48.844842 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.844830 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.844808881 +0000 UTC m=+9.176317317 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:48.945221 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:48.945185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:48.945408 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.945390 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:48.945471 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.945415 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:48.945471 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.945428 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:48.945565 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:48.945484 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.945464809 +0000 UTC m=+9.276973244 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:49.244210 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:49.243646 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:49.244210 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:49.243779 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:50.243747 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:50.243708 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:50.244228 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:50.243862 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:51.243730 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:51.243680 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:51.243928 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:51.243821 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:52.245390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:52.245352 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:52.245833 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.245548 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:52.877721 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:52.877665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:52.877918 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.877862 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:52.877982 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.877922 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:00.877905075 +0000 UTC m=+17.209413520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:52.978498 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:52.978460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:52.978691 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.978673 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:52.978734 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.978698 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:52.978734 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.978711 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:52.978813 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:52.978787 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:00.978768328 +0000 UTC m=+17.310276763 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:53.243958 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:53.243884 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:53.244126 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:53.244037 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:54.244713 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:54.244634 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:54.245143 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:54.244803 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:55.243727 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:55.243702 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:55.243880 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:55.243808 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:56.243784 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:56.243744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:56.244296 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:56.243904 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:57.243776 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:57.243738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:57.243946 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:57.243853 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:46:58.244283 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:58.244249 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:46:58.244760 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:58.244384 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:46:59.244325 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:46:59.244281 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:46:59.244789 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:46:59.244422 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:47:00.243580 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:00.243543 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:00.243767 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:00.243666 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:47:00.936413 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:00.936371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:00.936921 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:00.936536 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:00.936921 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:00.936612 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:16.936589951 +0000 UTC m=+33.268098394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:01.037055 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:01.036994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:01.037209 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:01.037157 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:01.037209 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:01.037177 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:01.037209 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:01.037187 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:01.037347 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:01.037244 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.037223638 +0000 UTC m=+33.368732070 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:01.243236 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:01.243160 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:01.243403 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:01.243264 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:47:02.243351 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:02.243318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:02.243758 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:02.243434 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:47:03.243865 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:03.243829 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:03.244210 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:03.243931 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:47:04.248587 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.247612 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:04.248587 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:04.248205 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5"
Apr 22 18:47:04.422316 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.422288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47h8f" event={"ID":"06ac6726-1f7b-4981-a05c-4538095c85b7","Type":"ContainerStarted","Data":"b39e8cb8b4cc0f1a75df6c45c345f26f19496706ee4e70893f5c1ccd10abb0d6"}
Apr 22 18:47:04.425408 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425391 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log"
Apr 22 18:47:04.425733 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425710 2570 generic.go:358] "Generic (PLEG): container finished" podID="4f7c20eb-284f-4276-b58c-ed5062c1325e" containerID="0de1c456b9e1f54fda396153f546b416fabcef23ae0ec60e2788d01dfc51d242" exitCode=1
Apr 22 18:47:04.425821 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425783 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"eb313314135d5f5bd7069b63284726e372895a6b55c514cd725886b5e731eed1"}
Apr 22 18:47:04.425821 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"a7fad417d117959e7bdaf97bbe9e681ab1f460794ad8f1bd8675ce8148b3c2dc"}
Apr 22 18:47:04.425931 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425827 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"8494c80efb2a3f42005669e19755661b178f09d7cd04592d50bb6a592c7fa434"}
Apr 22 18:47:04.425931 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425839 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"04a1dbb4548d7db0c1138c953ab7141752871832999b1c1f0986de50464b1838"}
Apr 22 18:47:04.425931 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425852 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerDied","Data":"0de1c456b9e1f54fda396153f546b416fabcef23ae0ec60e2788d01dfc51d242"}
Apr 22 18:47:04.425931 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.425866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"bb0d6e3589d0fe8e86aeeaef4166a32e92de004f8b8ce9c0f4f4f210a3359c68"}
Apr 22 18:47:04.427431 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.427270 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" event={"ID":"814fc44d-7767-426d-8e7f-760f0016f42f","Type":"ContainerStarted","Data":"f9bbf3e90fc49cf68969c74ab3e64214f357237d8a830da61a71d1fc0a575909"}
Apr 22 18:47:04.428648 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.428629 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" event={"ID":"8147dca2f1846ffe58ac40c8a9cdfc0b","Type":"ContainerStarted","Data":"382a7280bd34ed86803fa8ff3f5faa2b6a46ad7ea6b9b947308f6b74d06a08a4"}
Apr 22 18:47:04.442182 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.442124 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-47h8f" podStartSLOduration=2.502370557 podStartE2EDuration="20.4421106s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.533915374 +0000 UTC m=+1.865423805" lastFinishedPulling="2026-04-22 18:47:03.473655402 +0000 UTC m=+19.805163848" observedRunningTime="2026-04-22 18:47:04.440699343 +0000 UTC m=+20.772207798" watchObservedRunningTime="2026-04-22 18:47:04.4421106 +0000 UTC m=+20.773619052"
Apr 22 18:47:04.470998 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.470951 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" podStartSLOduration=19.470932537 podStartE2EDuration="19.470932537s" podCreationTimestamp="2026-04-22 18:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:47:04.454306102 +0000 UTC m=+20.785814554" watchObservedRunningTime="2026-04-22 18:47:04.470932537 +0000 UTC m=+20.802441001"
Apr 22 18:47:04.471175 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:04.471144 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7rzxj" podStartSLOduration=2.521147843 podStartE2EDuration="20.471131043s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.493240069 +0000 UTC m=+1.824748499" lastFinishedPulling="2026-04-22 18:47:03.443223256 +0000 UTC m=+19.774731699" observedRunningTime="2026-04-22 18:47:04.470701665 +0000 UTC m=+20.802210117" watchObservedRunningTime="2026-04-22 18:47:04.471131043 +0000 UTC m=+20.802639495"
Apr 22 18:47:05.243631 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.243455 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:05.243737 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:05.243716 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b"
Apr 22 18:47:05.357835 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.357811 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:47:05.431212 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.431171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wcprf" event={"ID":"587139e8-f488-4657-8806-34d257b2339c","Type":"ContainerStarted","Data":"d9c88ca4e61bfbe542fffb0a73c5a4d9a5602693cf42dd58f6664880fa051301"}
Apr 22 18:47:05.432456 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.432430 2570 generic.go:358] "Generic (PLEG): container finished" podID="94116840-a7e4-4953-83a7-56e00b343c31" containerID="60e447ef08bc53a620f123f343b0083e147733b611a6fec68bc59fe9fecc0b39" exitCode=0
Apr 22 18:47:05.432562 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.432497 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerDied","Data":"60e447ef08bc53a620f123f343b0083e147733b611a6fec68bc59fe9fecc0b39"}
Apr 22 18:47:05.433780 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.433739 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kzsjh" event={"ID":"188e6686-56b2-4173-8c6e-37c8297781e8","Type":"ContainerStarted","Data":"fa44eb2d8c7fbad2755156e3eb37a4bd086464b821a33d6c4d8d2e6788f11a06"}
Apr 22 18:47:05.434998 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.434979 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lxsx5" event={"ID":"0484c97f-ca09-4491-bd36-1cd68e364f27","Type":"ContainerStarted","Data":"c8fee1a111bf0d4cf3f87bdf3d30ce0b35aaa84dc8b5f982737622054128da50"}
Apr 22 18:47:05.436263 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.436238 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cpq2l" event={"ID":"e190ea4b-606c-4dd3-9785-eb0178af92e9","Type":"ContainerStarted","Data":"deee64b71f7e59618d0941eaf56cb723b25dcaea4e6832e511f395330784545e"}
Apr 22 18:47:05.437622 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.437604 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" event={"ID":"a26938d0-f0cc-41e8-a082-51c312992e57","Type":"ContainerStarted","Data":"e75fa20936c777172955290987c74faaba369756b3b5e0442aef0005aeb8cf87"}
Apr 22 18:47:05.437691 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.437626 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" event={"ID":"a26938d0-f0cc-41e8-a082-51c312992e57","Type":"ContainerStarted","Data":"96aeccf4a341107f95a5c282f4e604599c5536b2e7caf981df303b07a19ecb5f"}
Apr 22 18:47:05.438866 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.438845 2570 generic.go:358] "Generic (PLEG): container finished" podID="7b77433e362f2114f13c38a959650d25" containerID="a875448dc788131e7e2c4dbbca05abbf767b8e195cd7e62c37ec4247f81073c1" exitCode=0
Apr 22 18:47:05.438956 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.438925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" event={"ID":"7b77433e362f2114f13c38a959650d25","Type":"ContainerDied","Data":"a875448dc788131e7e2c4dbbca05abbf767b8e195cd7e62c37ec4247f81073c1"}
Apr 22 18:47:05.439063 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.439049 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal"
Apr 22 18:47:05.446310 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.446277 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wcprf" podStartSLOduration=3.478060735 podStartE2EDuration="21.446263886s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.443619582 +0000 UTC m=+1.775128012" lastFinishedPulling="2026-04-22 18:47:03.41182272 +0000 UTC m=+19.743331163" observedRunningTime="2026-04-22 18:47:05.446224752 +0000 UTC m=+21.777733204" watchObservedRunningTime="2026-04-22 18:47:05.446263886 +0000 UTC m=+21.777772338"
Apr 22 18:47:05.456823 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.456804 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:47:05.457306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.457291 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal"]
Apr 22 18:47:05.462268 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.462230 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kzsjh" podStartSLOduration=7.419203591 podStartE2EDuration="21.462219791s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.541610008 +0000 UTC m=+1.873118438"
lastFinishedPulling="2026-04-22 18:46:59.584626205 +0000 UTC m=+15.916134638" observedRunningTime="2026-04-22 18:47:05.461825657 +0000 UTC m=+21.793334113" watchObservedRunningTime="2026-04-22 18:47:05.462219791 +0000 UTC m=+21.793728243" Apr 22 18:47:05.476598 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.476554 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cpq2l" podStartSLOduration=3.562384535 podStartE2EDuration="21.476538409s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.497760833 +0000 UTC m=+1.829269263" lastFinishedPulling="2026-04-22 18:47:03.411914678 +0000 UTC m=+19.743423137" observedRunningTime="2026-04-22 18:47:05.47573393 +0000 UTC m=+21.807242383" watchObservedRunningTime="2026-04-22 18:47:05.476538409 +0000 UTC m=+21.808046862" Apr 22 18:47:05.509277 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:05.509058 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lxsx5" podStartSLOduration=11.786809667 podStartE2EDuration="21.509040552s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.514160316 +0000 UTC m=+1.845668751" lastFinishedPulling="2026-04-22 18:46:55.236391194 +0000 UTC m=+11.567899636" observedRunningTime="2026-04-22 18:47:05.508873516 +0000 UTC m=+21.840381974" watchObservedRunningTime="2026-04-22 18:47:05.509040552 +0000 UTC m=+21.840549007" Apr 22 18:47:06.189438 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.189322 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:47:05.357831544Z","UUID":"2f2a6346-b15d-4e3d-a34c-28e52ab92181","Handler":null,"Name":"","Endpoint":""} Apr 22 18:47:06.191685 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.191650 2570 csi_plugin.go:106] kubernetes.io/csi: 
Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:47:06.191685 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.191677 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:47:06.243704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.243679 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:47:06.243826 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:06.243805 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:47:06.443126 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.443036 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" event={"ID":"a26938d0-f0cc-41e8-a082-51c312992e57","Type":"ContainerStarted","Data":"6a3417046de2975aecf21c39db070378403bbb2c4e968aa07b51c1647617597a"} Apr 22 18:47:06.445060 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.444993 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" event={"ID":"7b77433e362f2114f13c38a959650d25","Type":"ContainerStarted","Data":"55694f5bc37841b0b7699d6dbee049e27fcf23ddaba91f21c54b3c8307bccb5b"} Apr 22 18:47:06.447996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.447975 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 18:47:06.448419 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.448395 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"5a2210efc3b1561187b02cbba4aa682dcb91fc3b6b4e26189f0fa481a6410bf0"} Apr 22 18:47:06.475677 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.475634 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mnznj" podStartSLOduration=1.828776168 podStartE2EDuration="22.475621603s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.473137996 +0000 UTC m=+1.804646425" lastFinishedPulling="2026-04-22 18:47:06.119983419 +0000 UTC m=+22.451491860" observedRunningTime="2026-04-22 18:47:06.475221507 +0000 UTC m=+22.806729962" watchObservedRunningTime="2026-04-22 18:47:06.475621603 +0000 UTC m=+22.807130054" Apr 22 18:47:06.490500 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:06.490460 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" podStartSLOduration=1.490444372 podStartE2EDuration="1.490444372s" podCreationTimestamp="2026-04-22 18:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:47:06.49000255 +0000 UTC m=+22.821511004" watchObservedRunningTime="2026-04-22 18:47:06.490444372 +0000 UTC m=+22.821952825" Apr 22 18:47:07.243244 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:07.243215 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:47:07.243455 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:07.243338 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b" Apr 22 18:47:07.595432 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:07.595360 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:47:07.596062 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:07.596027 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:47:08.246434 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:08.246405 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:47:08.246613 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:08.246544 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:47:08.452578 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:08.452535 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:47:08.453165 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:08.453141 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kzsjh" Apr 22 18:47:09.243786 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:09.243756 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:47:09.244236 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:09.243872 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b" Apr 22 18:47:10.246177 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.245993 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:47:10.246919 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:10.246262 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:47:10.458064 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.458036 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 18:47:10.458413 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.458385 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"0d02fc9255803824e29897906602ef31c7fd391287e4209c162188ff6a60d38f"} Apr 22 18:47:10.458632 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.458619 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:47:10.458864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.458836 2570 scope.go:117] "RemoveContainer" containerID="0de1c456b9e1f54fda396153f546b416fabcef23ae0ec60e2788d01dfc51d242" Apr 22 18:47:10.460159 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.460132 2570 generic.go:358] "Generic (PLEG): container finished" podID="94116840-a7e4-4953-83a7-56e00b343c31" containerID="5df6a78b8605f62d90e656b7ceb299f4ed9378db13264bf583245e90aff8ced1" exitCode=0 Apr 22 18:47:10.460255 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.460208 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerDied","Data":"5df6a78b8605f62d90e656b7ceb299f4ed9378db13264bf583245e90aff8ced1"} Apr 22 18:47:10.474436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:10.474415 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:47:11.243774 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:47:11.243631 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:47:11.243877 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:11.243858 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b" Apr 22 18:47:11.436974 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.436943 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57n8t"] Apr 22 18:47:11.439712 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.439687 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mf94f"] Apr 22 18:47:11.439823 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.439800 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:47:11.439905 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:11.439882 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:47:11.464728 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.464696 2570 generic.go:358] "Generic (PLEG): container finished" podID="94116840-a7e4-4953-83a7-56e00b343c31" containerID="81230753a045abd67381b22bfb085f88eaf28223a7ebd052557f833563d6c679" exitCode=0 Apr 22 18:47:11.464876 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.464788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerDied","Data":"81230753a045abd67381b22bfb085f88eaf28223a7ebd052557f833563d6c679"} Apr 22 18:47:11.468673 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.468641 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 18:47:11.469032 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.468987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" event={"ID":"4f7c20eb-284f-4276-b58c-ed5062c1325e","Type":"ContainerStarted","Data":"cfe4282a968c746f2608fb6f1f57a8e26a9a45ccf38a6cdc5e6c61e3af0870d1"} Apr 22 18:47:11.469032 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.469028 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:47:11.469165 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.469123 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:11.469165 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:11.469131 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b" Apr 22 18:47:11.469293 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.469280 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:47:11.485570 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.485547 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:47:11.517213 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:11.517169 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" podStartSLOduration=9.155320763 podStartE2EDuration="27.517155986s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.462488918 +0000 UTC m=+1.793997348" lastFinishedPulling="2026-04-22 18:47:03.824324127 +0000 UTC m=+20.155832571" observedRunningTime="2026-04-22 18:47:11.516678048 +0000 UTC m=+27.848186511" watchObservedRunningTime="2026-04-22 18:47:11.517155986 +0000 UTC m=+27.848664437" Apr 22 18:47:12.473524 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:12.473496 2570 generic.go:358] "Generic (PLEG): container finished" podID="94116840-a7e4-4953-83a7-56e00b343c31" 
containerID="5754fad5c576c60584380083c5e25e8f71b00601edd2deba09dddea70f91809d" exitCode=0 Apr 22 18:47:12.473875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:12.473583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerDied","Data":"5754fad5c576c60584380083c5e25e8f71b00601edd2deba09dddea70f91809d"} Apr 22 18:47:12.473875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:12.473801 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:13.244317 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:13.244242 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:47:13.244484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:13.244254 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:47:13.244484 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:13.244365 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b" Apr 22 18:47:13.244484 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:13.244432 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:47:13.475850 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:13.475822 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:13.820162 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:13.820126 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:47:14.490262 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:14.490237 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tlrd2" Apr 22 18:47:15.243428 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.243393 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:47:15.243428 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.243425 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t" Apr 22 18:47:15.243673 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:15.243531 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:47:15.243673 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:15.243621 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57n8t" podUID="3dff91e7-4af5-48c7-992c-03ed3b2b6c0b" Apr 22 18:47:15.467063 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.467035 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeReady" Apr 22 18:47:15.467198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.467190 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:47:15.509445 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.509371 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6zgnq"] Apr 22 18:47:15.553860 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.553833 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n4k5q"] Apr 22 18:47:15.554027 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.553979 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zgnq" Apr 22 18:47:15.556366 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.556346 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w2ls6\"" Apr 22 18:47:15.556366 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.556353 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:47:15.556366 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.556367 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:47:15.567790 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.567772 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6zgnq"] Apr 22 18:47:15.567790 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.567792 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-n4k5q"] Apr 22 18:47:15.567922 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.567872 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:47:15.570576 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.570357 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:47:15.570576 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.570430 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q94s6\"" Apr 22 18:47:15.570576 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.570454 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:47:15.570750 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.570585 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:47:15.653711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.653674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 18:47:15.653711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.653722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:47:15.653970 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:47:15.653806 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7rw\" (UniqueName: \"kubernetes.io/projected/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-kube-api-access-gg7rw\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:47:15.653970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.653866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/030083d5-b0a7-4438-a7f6-06eae3c80777-tmp-dir\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 18:47:15.653970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.653917 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8x8l\" (UniqueName: \"kubernetes.io/projected/030083d5-b0a7-4438-a7f6-06eae3c80777-kube-api-access-w8x8l\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 18:47:15.653970 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.653952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/030083d5-b0a7-4438-a7f6-06eae3c80777-config-volume\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 18:47:15.754789 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.754610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 
18:47:15.754789 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.754795 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:15.754978 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.754821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7rw\" (UniqueName: \"kubernetes.io/projected/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-kube-api-access-gg7rw\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:15.754978 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.754849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/030083d5-b0a7-4438-a7f6-06eae3c80777-tmp-dir\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:15.754978 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.754877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8x8l\" (UniqueName: \"kubernetes.io/projected/030083d5-b0a7-4438-a7f6-06eae3c80777-kube-api-access-w8x8l\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:15.754978 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:15.754763 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:15.754978 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.754896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/030083d5-b0a7-4438-a7f6-06eae3c80777-config-volume\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:15.755195 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:15.754996 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:16.254973242 +0000 UTC m=+32.586481688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:15.755195 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:15.754892 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:15.755195 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:15.755049 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:16.255039738 +0000 UTC m=+32.586548175 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:15.755303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.755205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/030083d5-b0a7-4438-a7f6-06eae3c80777-tmp-dir\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:15.755395 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.755380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/030083d5-b0a7-4438-a7f6-06eae3c80777-config-volume\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:15.765677 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.765626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8x8l\" (UniqueName: \"kubernetes.io/projected/030083d5-b0a7-4438-a7f6-06eae3c80777-kube-api-access-w8x8l\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:15.765791 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:15.765721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7rw\" (UniqueName: \"kubernetes.io/projected/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-kube-api-access-gg7rw\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:16.259050 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:16.258992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:16.259299 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:16.259088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:16.259299 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:16.259163 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:16.259299 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:16.259239 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:16.259299 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:16.259247 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.259225311 +0000 UTC m=+33.590733756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:16.259493 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:16.259307 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.259290357 +0000 UTC m=+33.590798792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:16.964921 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:16.964881 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:16.965350 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:16.965078 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:16.965350 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:16.965165 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:48.965142731 +0000 UTC m=+65.296651173 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:47:17.065648 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.065611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:17.065827 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.065793 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:47:17.065827 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.065820 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:47:17.065929 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.065835 2570 projected.go:194] Error preparing data for projected volume kube-api-access-gqmv8 for pod openshift-network-diagnostics/network-check-target-57n8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:17.065929 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.065899 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8 podName:3dff91e7-4af5-48c7-992c-03ed3b2b6c0b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:49.065883757 +0000 UTC m=+65.397392187 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gqmv8" (UniqueName: "kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8") pod "network-check-target-57n8t" (UID: "3dff91e7-4af5-48c7-992c-03ed3b2b6c0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:47:17.243966 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.243885 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:17.244191 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.243889 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:17.247071 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.247042 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:17.249046 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.248667 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:17.249046 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.248955 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c77rp\""
Apr 22 18:47:17.249222 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.249175 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:17.249276 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.249267 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qpsz\""
Apr 22 18:47:17.266955 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.266933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:17.267089 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:17.266970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:17.267155 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.267106 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:17.267155 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.267123 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:17.267255 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.267178 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:19.267156598 +0000 UTC m=+35.598665031 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:17.267255 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:17.267200 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:19.267188941 +0000 UTC m=+35.598697377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:18.487332 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:18.487303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerStarted","Data":"6e4407ba4633d8d5a395ba6fcee3035ab9070caff31c05cc85d09f41d9ecf12d"}
Apr 22 18:47:19.280487 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:19.280453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:19.280487 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:19.280488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:19.280744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:19.280606 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:19.280744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:19.280614 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:19.280744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:19.280660 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:23.280645444 +0000 UTC m=+39.612153874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:19.280744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:19.280672 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:23.280666691 +0000 UTC m=+39.612175121 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:19.491418 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:19.491384 2570 generic.go:358] "Generic (PLEG): container finished" podID="94116840-a7e4-4953-83a7-56e00b343c31" containerID="6e4407ba4633d8d5a395ba6fcee3035ab9070caff31c05cc85d09f41d9ecf12d" exitCode=0
Apr 22 18:47:19.491819 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:19.491466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerDied","Data":"6e4407ba4633d8d5a395ba6fcee3035ab9070caff31c05cc85d09f41d9ecf12d"}
Apr 22 18:47:20.496306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:20.496271 2570 generic.go:358] "Generic (PLEG): container finished" podID="94116840-a7e4-4953-83a7-56e00b343c31" containerID="a4e61bccb39d4236c3a6f918e398b6b7f5ae5621e2f82c9dbfe31d03ad075b9a" exitCode=0
Apr 22 18:47:20.496788 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:20.496346 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerDied","Data":"a4e61bccb39d4236c3a6f918e398b6b7f5ae5621e2f82c9dbfe31d03ad075b9a"}
Apr 22 18:47:21.501333 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:21.501300 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c74jc" event={"ID":"94116840-a7e4-4953-83a7-56e00b343c31","Type":"ContainerStarted","Data":"3675da563f90b1380aac2ae3e3f7a434240af6141d19d8928ca103feadb8b061"}
Apr 22 18:47:21.545795 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:21.545741 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c74jc" podStartSLOduration=4.7646584149999995 podStartE2EDuration="37.545724324s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.545435297 +0000 UTC m=+1.876943726" lastFinishedPulling="2026-04-22 18:47:18.326501201 +0000 UTC m=+34.658009635" observedRunningTime="2026-04-22 18:47:21.545685748 +0000 UTC m=+37.877194200" watchObservedRunningTime="2026-04-22 18:47:21.545724324 +0000 UTC m=+37.877232771"
Apr 22 18:47:23.311997 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:23.311961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:23.311997 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:23.311997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:23.312473 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:23.312109 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:23.312473 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:23.312123 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:23.312473 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:23.312163 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:31.312148906 +0000 UTC m=+47.643657335 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:23.312473 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:23.312188 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:31.312168765 +0000 UTC m=+47.643677200 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:31.368322 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:31.368275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:31.368322 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:31.368327 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:31.368859 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:31.368418 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:31.368859 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:31.368481 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:31.368859 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:31.368484 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:47.368468199 +0000 UTC m=+63.699976633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:31.368859 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:31.368542 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:47.368526235 +0000 UTC m=+63.700034675 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:47.371845 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:47.371813 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:47:47.371845 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:47.371847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:47:47.372345 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:47.371955 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:47.372345 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:47.371958 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:47.372345 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:47.372002 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:19.371989775 +0000 UTC m=+95.703498206 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:47:47.372345 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:47.372032 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:19.372008685 +0000 UTC m=+95.703517114 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:48.983782 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:48.983743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:47:48.986435 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:48.986420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:48.994612 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:48.994597 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:47:48.994665 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:47:48.994655 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:52.994635628 +0000 UTC m=+129.326144062 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : secret "metrics-daemon-secret" not found
Apr 22 18:47:49.084533 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.084504 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:49.087526 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.087503 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:49.097125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.097105 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:49.107720 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.107696 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqmv8\" (UniqueName: \"kubernetes.io/projected/3dff91e7-4af5-48c7-992c-03ed3b2b6c0b-kube-api-access-gqmv8\") pod \"network-check-target-57n8t\" (UID: \"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b\") " pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:49.361252 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.361226 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c77rp\""
Apr 22 18:47:49.369436 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.369412 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:49.542963 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:49.542909 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57n8t"]
Apr 22 18:47:49.547501 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:47:49.547470 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dff91e7_4af5_48c7_992c_03ed3b2b6c0b.slice/crio-82c8d0b872f90a0e08e89bf0e1c83fe607cc74ed3b671233ff3cf1efd6a4b8be WatchSource:0}: Error finding container 82c8d0b872f90a0e08e89bf0e1c83fe607cc74ed3b671233ff3cf1efd6a4b8be: Status 404 returned error can't find the container with id 82c8d0b872f90a0e08e89bf0e1c83fe607cc74ed3b671233ff3cf1efd6a4b8be
Apr 22 18:47:50.554942 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:50.554888 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57n8t" event={"ID":"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b","Type":"ContainerStarted","Data":"82c8d0b872f90a0e08e89bf0e1c83fe607cc74ed3b671233ff3cf1efd6a4b8be"}
Apr 22 18:47:52.561153 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:52.560993 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57n8t" event={"ID":"3dff91e7-4af5-48c7-992c-03ed3b2b6c0b","Type":"ContainerStarted","Data":"5657b01e0c81a8eed41e0b04e6371ca24ac142e3c36d1588f8f511adace96fb5"}
Apr 22 18:47:52.561519 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:52.561238 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:47:52.577548 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:47:52.577505 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-57n8t" podStartSLOduration=65.986506566 podStartE2EDuration="1m8.577492924s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:47:49.553772579 +0000 UTC m=+65.885281008" lastFinishedPulling="2026-04-22 18:47:52.144758922 +0000 UTC m=+68.476267366" observedRunningTime="2026-04-22 18:47:52.576916669 +0000 UTC m=+68.908425117" watchObservedRunningTime="2026-04-22 18:47:52.577492924 +0000 UTC m=+68.909001411"
Apr 22 18:48:19.396101 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:19.396047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq"
Apr 22 18:48:19.396101 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:19.396107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q"
Apr 22 18:48:19.396640 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:19.396218 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:48:19.396640 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:19.396251 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:48:19.396640 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:19.396297 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert podName:b72ffd58-b0b5-48d0-b56c-8d4e7f30307b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:23.396279165 +0000 UTC m=+159.727787614 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert") pod "ingress-canary-n4k5q" (UID: "b72ffd58-b0b5-48d0-b56c-8d4e7f30307b") : secret "canary-serving-cert" not found
Apr 22 18:48:19.396640 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:19.396316 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls podName:030083d5-b0a7-4438-a7f6-06eae3c80777 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:23.39630693 +0000 UTC m=+159.727815360 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls") pod "dns-default-6zgnq" (UID: "030083d5-b0a7-4438-a7f6-06eae3c80777") : secret "dns-default-metrics-tls" not found
Apr 22 18:48:23.565599 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:23.565569 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-57n8t"
Apr 22 18:48:53.030698 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:53.030644 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f"
Apr 22 18:48:53.031219 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:53.030791 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:48:53.031219 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:53.030868 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs podName:383fe532-b742-451a-8a94-dc5c7fd3fce5 nodeName:}" failed. No retries permitted until 2026-04-22 18:50:55.030850949 +0000 UTC m=+251.362359378 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs") pod "network-metrics-daemon-mf94f" (UID: "383fe532-b742-451a-8a94-dc5c7fd3fce5") : secret "metrics-daemon-secret" not found
Apr 22 18:48:57.143306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.143273 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"]
Apr 22 18:48:57.145931 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.145912 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.148596 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.148578 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 18:48:57.149819 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.149803 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:48:57.149894 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.149868 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4v4zb\""
Apr 22 18:48:57.152292 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.152274 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:48:57.152978 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.152963 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 18:48:57.157722 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.157704 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"]
Apr 22 18:48:57.243704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.243680 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"]
Apr 22 18:48:57.246486 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.246439 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"
Apr 22 18:48:57.247693 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.247674 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nzr8k"]
Apr 22 18:48:57.249275 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.249250 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pfncf\""
Apr 22 18:48:57.250088 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.250074 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-664bcd8945-zfwms"]
Apr 22 18:48:57.250224 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.250210 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.252504 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.252488 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.253780 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.253624 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9bkdk\""
Apr 22 18:48:57.253853 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.253826 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:48:57.254124 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.254105 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 18:48:57.254213 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.254143 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 18:48:57.254366 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.254351 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 18:48:57.255378 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.255363 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:48:57.255663 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.255648 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:48:57.255740 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.255671 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:48:57.257307 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.257286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59hj\" (UniqueName: \"kubernetes.io/projected/82e68df6-e973-4eb4-9f72-57676895ca9b-kube-api-access-h59hj\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.257412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.257332 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.257412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.257368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/82e68df6-e973-4eb4-9f72-57676895ca9b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.257687 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.257672 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"]
Apr 22 18:48:57.258151 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.258132 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vkfzv\""
Apr 22 18:48:57.263580 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.263560 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22
18:48:57.264666 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.264644 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:48:57.265429 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.265409 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nzr8k"]
Apr 22 18:48:57.268483 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.268464 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-664bcd8945-zfwms"]
Apr 22 18:48:57.350163 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.350133 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"]
Apr 22 18:48:57.353003 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.352988 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.355652 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.355629 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 22 18:48:57.355652 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.355651 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jc5qv\""
Apr 22 18:48:57.355831 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.355659 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 22 18:48:57.355831 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.355680 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 22 18:48:57.355957 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.355942 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:48:57.357743 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/82e68df6-e973-4eb4-9f72-57676895ca9b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.357843 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357766 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7413bf65-85db-4595-b685-fede8886d53f-ca-trust-extracted\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.357843 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357794 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e711ad-d134-4b34-9606-2a2cb1b1f283-serving-cert\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.357843 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e711ad-d134-4b34-9606-2a2cb1b1f283-trusted-ca\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.358047 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357840 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-installation-pull-secrets\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358047 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357926 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8fpl\" (UniqueName: \"kubernetes.io/projected/26e711ad-d134-4b34-9606-2a2cb1b1f283-kube-api-access-v8fpl\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.358047 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.357972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h59hj\" (UniqueName: \"kubernetes.io/projected/82e68df6-e973-4eb4-9f72-57676895ca9b-kube-api-access-h59hj\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.358047 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e711ad-d134-4b34-9606-2a2cb1b1f283-config\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" Apr 22
18:48:57.358047 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-bound-sa-token\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358077 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfsrl\" (UniqueName: \"kubernetes.io/projected/e6379d65-65a4-43ce-90b3-22b4af6360dc-kube-api-access-cfsrl\") pod \"network-check-source-8894fc9bd-98fl9\" (UID: \"e6379d65-65a4-43ce-90b3-22b4af6360dc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-image-registry-private-configuration\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358126 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-trusted-ca\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358149 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl77g\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-kube-api-access-gl77g\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-registry-certificates\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.358303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358261 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.358639 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.358337 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:48:57.358639 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.358400 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls podName:82e68df6-e973-4eb4-9f72-57676895ca9b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:57.8583828 +0000 UTC m=+134.189891230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qhml5" (UID: "82e68df6-e973-4eb4-9f72-57676895ca9b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:48:57.358639 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.358536 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/82e68df6-e973-4eb4-9f72-57676895ca9b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.361243 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.361215 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"]
Apr 22 18:48:57.371799 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.371778 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59hj\" (UniqueName: \"kubernetes.io/projected/82e68df6-e973-4eb4-9f72-57676895ca9b-kube-api-access-h59hj\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:57.458884 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.458821 2570 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.458884 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.458865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e711ad-d134-4b34-9606-2a2cb1b1f283-config\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.458884 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.458882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-bound-sa-token\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459078 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.458920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4cq7\" (UniqueName: \"kubernetes.io/projected/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-kube-api-access-v4cq7\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.459078 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfsrl\" (UniqueName: \"kubernetes.io/projected/e6379d65-65a4-43ce-90b3-22b4af6360dc-kube-api-access-cfsrl\") pod \"network-check-source-8894fc9bd-98fl9\" (UID: \"e6379d65-65a4-43ce-90b3-22b4af6360dc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"
Apr 22 18:48:57.459142 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-image-registry-private-configuration\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459142 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-trusted-ca\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459142 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl77g\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-kube-api-access-gl77g\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459288 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-registry-certificates\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459288 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459288 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459250 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.459431 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.459321 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:48:57.459431 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.459339 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-664bcd8945-zfwms: secret "image-registry-tls" not found
Apr 22 18:48:57.459431 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7413bf65-85db-4595-b685-fede8886d53f-ca-trust-extracted\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459431 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.459404 2570
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls podName:7413bf65-85db-4595-b685-fede8886d53f nodeName:}" failed. No retries permitted until 2026-04-22 18:48:57.959384676 +0000 UTC m=+134.290893112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls") pod "image-registry-664bcd8945-zfwms" (UID: "7413bf65-85db-4595-b685-fede8886d53f") : secret "image-registry-tls" not found
Apr 22 18:48:57.459621 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e711ad-d134-4b34-9606-2a2cb1b1f283-serving-cert\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.459621 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e711ad-d134-4b34-9606-2a2cb1b1f283-trusted-ca\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.459621 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459514 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-installation-pull-secrets\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.459621 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8fpl\" (UniqueName: \"kubernetes.io/projected/26e711ad-d134-4b34-9606-2a2cb1b1f283-kube-api-access-v8fpl\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.459812 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e711ad-d134-4b34-9606-2a2cb1b1f283-config\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.459874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.459838 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-registry-certificates\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.460148 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.460113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7413bf65-85db-4595-b685-fede8886d53f-ca-trust-extracted\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.460258 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.460243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-trusted-ca\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.460985 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.460960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26e711ad-d134-4b34-9606-2a2cb1b1f283-trusted-ca\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.461539 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.461521 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-image-registry-private-configuration\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.461916 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.461901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e711ad-d134-4b34-9606-2a2cb1b1f283-serving-cert\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.462185 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.462169 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-installation-pull-secrets\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.468534 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.468508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName:
\"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-bound-sa-token\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.468730 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.468710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfsrl\" (UniqueName: \"kubernetes.io/projected/e6379d65-65a4-43ce-90b3-22b4af6360dc-kube-api-access-cfsrl\") pod \"network-check-source-8894fc9bd-98fl9\" (UID: \"e6379d65-65a4-43ce-90b3-22b4af6360dc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"
Apr 22 18:48:57.468774 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.468721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl77g\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-kube-api-access-gl77g\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:57.468774 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.468741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8fpl\" (UniqueName: \"kubernetes.io/projected/26e711ad-d134-4b34-9606-2a2cb1b1f283-kube-api-access-v8fpl\") pod \"console-operator-9d4b6777b-nzr8k\" (UID: \"26e711ad-d134-4b34-9606-2a2cb1b1f283\") " pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.556855 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.556812 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"
Apr 22 18:48:57.560703 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.560684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.560799 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.560729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4cq7\" (UniqueName: \"kubernetes.io/projected/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-kube-api-access-v4cq7\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.560857 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.560800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.561291 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.561267 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.562790 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.562762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.564834 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.564813 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:48:57.570291 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.570271 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4cq7\" (UniqueName: \"kubernetes.io/projected/fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3-kube-api-access-v4cq7\") pod \"kube-storage-version-migrator-operator-6769c5d45-bd5pc\" (UID: \"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"
Apr 22 18:48:57.662401 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.662376 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc" Apr 22 18:48:57.672226 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.672198 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9"] Apr 22 18:48:57.675545 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:48:57.675514 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6379d65_65a4_43ce_90b3_22b4af6360dc.slice/crio-23a97d309be4ddd69e8191bed50638907782b04ee4da99953d32769dc1a08472 WatchSource:0}: Error finding container 23a97d309be4ddd69e8191bed50638907782b04ee4da99953d32769dc1a08472: Status 404 returned error can't find the container with id 23a97d309be4ddd69e8191bed50638907782b04ee4da99953d32769dc1a08472 Apr 22 18:48:57.679666 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.679639 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9" event={"ID":"e6379d65-65a4-43ce-90b3-22b4af6360dc","Type":"ContainerStarted","Data":"23a97d309be4ddd69e8191bed50638907782b04ee4da99953d32769dc1a08472"} Apr 22 18:48:57.688118 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.688086 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nzr8k"] Apr 22 18:48:57.691414 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:48:57.691373 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e711ad_d134_4b34_9606_2a2cb1b1f283.slice/crio-7911c35cc10b9fd1c8eafc7d39860fe2fa22d51fa42afdd1bb0cc3338dfdeb08 WatchSource:0}: Error finding container 7911c35cc10b9fd1c8eafc7d39860fe2fa22d51fa42afdd1bb0cc3338dfdeb08: Status 404 returned error can't find the container with id 
7911c35cc10b9fd1c8eafc7d39860fe2fa22d51fa42afdd1bb0cc3338dfdeb08 Apr 22 18:48:57.781393 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.781353 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc"] Apr 22 18:48:57.785303 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:48:57.785276 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe75fb92_d6fe_47f7_96e0_d03ff3a8acc3.slice/crio-da535a93382daccd2fce114e074c909f5fa3ba77df583812339b1f9d78e51a54 WatchSource:0}: Error finding container da535a93382daccd2fce114e074c909f5fa3ba77df583812339b1f9d78e51a54: Status 404 returned error can't find the container with id da535a93382daccd2fce114e074c909f5fa3ba77df583812339b1f9d78e51a54 Apr 22 18:48:57.863095 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.863061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" Apr 22 18:48:57.863254 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.863203 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:57.863302 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.863264 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls podName:82e68df6-e973-4eb4-9f72-57676895ca9b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:58.86324713 +0000 UTC m=+135.194755560 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qhml5" (UID: "82e68df6-e973-4eb4-9f72-57676895ca9b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:48:57.963638 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:57.963560 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:48:57.963782 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.963726 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:57.963782 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.963749 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-664bcd8945-zfwms: secret "image-registry-tls" not found Apr 22 18:48:57.963874 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:57.963813 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls podName:7413bf65-85db-4595-b685-fede8886d53f nodeName:}" failed. No retries permitted until 2026-04-22 18:48:58.963792389 +0000 UTC m=+135.295300825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls") pod "image-registry-664bcd8945-zfwms" (UID: "7413bf65-85db-4595-b685-fede8886d53f") : secret "image-registry-tls" not found Apr 22 18:48:58.683275 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:58.683234 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" event={"ID":"26e711ad-d134-4b34-9606-2a2cb1b1f283","Type":"ContainerStarted","Data":"7911c35cc10b9fd1c8eafc7d39860fe2fa22d51fa42afdd1bb0cc3338dfdeb08"} Apr 22 18:48:58.686185 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:58.686147 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc" event={"ID":"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3","Type":"ContainerStarted","Data":"da535a93382daccd2fce114e074c909f5fa3ba77df583812339b1f9d78e51a54"} Apr 22 18:48:58.688312 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:58.687898 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9" event={"ID":"e6379d65-65a4-43ce-90b3-22b4af6360dc","Type":"ContainerStarted","Data":"c681ffec7d9f2a7e70bfe50c3ab3cb1bc4172d95761213606ad544ccc7dc7e17"} Apr 22 18:48:58.704463 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:58.704071 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-98fl9" podStartSLOduration=1.70405587 podStartE2EDuration="1.70405587s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:58.703465589 +0000 UTC m=+135.034974042" watchObservedRunningTime="2026-04-22 18:48:58.70405587 +0000 UTC m=+135.035564324" 
Apr 22 18:48:58.872375 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:58.872334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:48:58.872577 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:58.872525 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:48:58.872651 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:58.872604 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls podName:82e68df6-e973-4eb4-9f72-57676895ca9b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.872582375 +0000 UTC m=+137.204090806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qhml5" (UID: "82e68df6-e973-4eb4-9f72-57676895ca9b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:48:58.973204 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:48:58.973115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:48:58.973357 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:58.973279 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:48:58.973357 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:58.973299 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-664bcd8945-zfwms: secret "image-registry-tls" not found
Apr 22 18:48:58.973357 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:48:58.973354 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls podName:7413bf65-85db-4595-b685-fede8886d53f nodeName:}" failed. No retries permitted until 2026-04-22 18:49:00.973338136 +0000 UTC m=+137.304846566 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls") pod "image-registry-664bcd8945-zfwms" (UID: "7413bf65-85db-4595-b685-fede8886d53f") : secret "image-registry-tls" not found
Apr 22 18:49:00.692720 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.692684 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/0.log"
Apr 22 18:49:00.693174 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.692730 2570 generic.go:358] "Generic (PLEG): container finished" podID="26e711ad-d134-4b34-9606-2a2cb1b1f283" containerID="e65f21d1068210692e07108db7f2e865b4287ac7e9418c6529348140234a9c54" exitCode=255
Apr 22 18:49:00.693174 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.692816 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" event={"ID":"26e711ad-d134-4b34-9606-2a2cb1b1f283","Type":"ContainerDied","Data":"e65f21d1068210692e07108db7f2e865b4287ac7e9418c6529348140234a9c54"}
Apr 22 18:49:00.693174 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.693088 2570 scope.go:117] "RemoveContainer" containerID="e65f21d1068210692e07108db7f2e865b4287ac7e9418c6529348140234a9c54"
Apr 22 18:49:00.694141 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.694117 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc" event={"ID":"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3","Type":"ContainerStarted","Data":"13cc532866d9a4aa8e51bd96d6d976cf83ba55e7288d10fb3e341d31c5c7747b"}
Apr 22 18:49:00.726664 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.726616 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc" podStartSLOduration=1.6085511110000001 podStartE2EDuration="3.726600713s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:48:57.787120208 +0000 UTC m=+134.118628652" lastFinishedPulling="2026-04-22 18:48:59.905169823 +0000 UTC m=+136.236678254" observedRunningTime="2026-04-22 18:49:00.725473514 +0000 UTC m=+137.056981967" watchObservedRunningTime="2026-04-22 18:49:00.726600713 +0000 UTC m=+137.058109166"
Apr 22 18:49:00.889804 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.889774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:49:00.889941 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:00.889910 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:49:00.889990 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:00.889975 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls podName:82e68df6-e973-4eb4-9f72-57676895ca9b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:04.889959715 +0000 UTC m=+141.221468144 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qhml5" (UID: "82e68df6-e973-4eb4-9f72-57676895ca9b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:49:00.990964 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:00.990893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:49:00.991075 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:00.990994 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:49:00.991075 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:00.991004 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-664bcd8945-zfwms: secret "image-registry-tls" not found
Apr 22 18:49:00.991075 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:00.991065 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls podName:7413bf65-85db-4595-b685-fede8886d53f nodeName:}" failed. No retries permitted until 2026-04-22 18:49:04.991050589 +0000 UTC m=+141.322559018 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls") pod "image-registry-664bcd8945-zfwms" (UID: "7413bf65-85db-4595-b685-fede8886d53f") : secret "image-registry-tls" not found
Apr 22 18:49:01.698356 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:01.698323 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/1.log"
Apr 22 18:49:01.698760 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:01.698744 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/0.log"
Apr 22 18:49:01.698809 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:01.698779 2570 generic.go:358] "Generic (PLEG): container finished" podID="26e711ad-d134-4b34-9606-2a2cb1b1f283" containerID="cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80" exitCode=255
Apr 22 18:49:01.698903 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:01.698878 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" event={"ID":"26e711ad-d134-4b34-9606-2a2cb1b1f283","Type":"ContainerDied","Data":"cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80"}
Apr 22 18:49:01.698962 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:01.698930 2570 scope.go:117] "RemoveContainer" containerID="e65f21d1068210692e07108db7f2e865b4287ac7e9418c6529348140234a9c54"
Apr 22 18:49:01.699140 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:01.699120 2570 scope.go:117] "RemoveContainer" containerID="cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80"
Apr 22 18:49:01.699345 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:01.699319 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nzr8k_openshift-console-operator(26e711ad-d134-4b34-9606-2a2cb1b1f283)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podUID="26e711ad-d134-4b34-9606-2a2cb1b1f283"
Apr 22 18:49:02.701887 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:02.701858 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/1.log"
Apr 22 18:49:02.702293 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:02.702211 2570 scope.go:117] "RemoveContainer" containerID="cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80"
Apr 22 18:49:02.702385 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:02.702368 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nzr8k_openshift-console-operator(26e711ad-d134-4b34-9606-2a2cb1b1f283)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podUID="26e711ad-d134-4b34-9606-2a2cb1b1f283"
Apr 22 18:49:03.724511 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:03.724484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cpq2l_e190ea4b-606c-4dd3-9785-eb0178af92e9/dns-node-resolver/0.log"
Apr 22 18:49:04.034356 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.034282 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pddkv"]
Apr 22 18:49:04.038008 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.037993 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.040644 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.040623 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 18:49:04.040953 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.040937 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 18:49:04.041049 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.040961 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 18:49:04.042160 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.042141 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 18:49:04.042229 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.042146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-vz4db\""
Apr 22 18:49:04.045737 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.045718 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pddkv"]
Apr 22 18:49:04.215865 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.215828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-signing-cabundle\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.216045 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.215935 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-signing-key\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.216045 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.215969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvghq\" (UniqueName: \"kubernetes.io/projected/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-kube-api-access-bvghq\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.317111 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.316996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvghq\" (UniqueName: \"kubernetes.io/projected/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-kube-api-access-bvghq\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.317111 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.317098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-signing-cabundle\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.317309 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.317168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-signing-key\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.317785 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.317762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-signing-cabundle\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.319486 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.319465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-signing-key\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.325820 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.325796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvghq\" (UniqueName: \"kubernetes.io/projected/e4d87dfb-aa7e-4a01-8679-afb5a24d4914-kube-api-access-bvghq\") pod \"service-ca-865cb79987-pddkv\" (UID: \"e4d87dfb-aa7e-4a01-8679-afb5a24d4914\") " pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.347775 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.347750 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pddkv"
Apr 22 18:49:04.465056 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.465003 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pddkv"]
Apr 22 18:49:04.467841 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:04.467811 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4d87dfb_aa7e_4a01_8679_afb5a24d4914.slice/crio-a63188b35cafe1dd8001aa7f3124d17b10718730c276ec8c4ac38f6c8d2964b1 WatchSource:0}: Error finding container a63188b35cafe1dd8001aa7f3124d17b10718730c276ec8c4ac38f6c8d2964b1: Status 404 returned error can't find the container with id a63188b35cafe1dd8001aa7f3124d17b10718730c276ec8c4ac38f6c8d2964b1
Apr 22 18:49:04.706952 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.706916 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pddkv" event={"ID":"e4d87dfb-aa7e-4a01-8679-afb5a24d4914","Type":"ContainerStarted","Data":"a63188b35cafe1dd8001aa7f3124d17b10718730c276ec8c4ac38f6c8d2964b1"}
Apr 22 18:49:04.724673 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.724652 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lxsx5_0484c97f-ca09-4491-bd36-1cd68e364f27/node-ca/0.log"
Apr 22 18:49:04.922891 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:04.922854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:49:04.923098 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:04.923046 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:49:04.923159 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:04.923126 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls podName:82e68df6-e973-4eb4-9f72-57676895ca9b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:12.923104228 +0000 UTC m=+149.254612671 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qhml5" (UID: "82e68df6-e973-4eb4-9f72-57676895ca9b") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:49:05.023599 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:05.023519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms"
Apr 22 18:49:05.023744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:05.023617 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:49:05.023744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:05.023635 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-664bcd8945-zfwms: secret "image-registry-tls" not found
Apr 22 18:49:05.023744 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:05.023695 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls podName:7413bf65-85db-4595-b685-fede8886d53f nodeName:}" failed. No retries permitted until 2026-04-22 18:49:13.023678114 +0000 UTC m=+149.355186546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls") pod "image-registry-664bcd8945-zfwms" (UID: "7413bf65-85db-4595-b685-fede8886d53f") : secret "image-registry-tls" not found
Apr 22 18:49:06.713815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:06.713780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pddkv" event={"ID":"e4d87dfb-aa7e-4a01-8679-afb5a24d4914","Type":"ContainerStarted","Data":"dbd8481befc61ad8722946af9e191dfc5adfe3cfd577cedd2dfe6fca0eb85e75"}
Apr 22 18:49:06.741428 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:06.741374 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-pddkv" podStartSLOduration=1.176003156 podStartE2EDuration="2.741359475s" podCreationTimestamp="2026-04-22 18:49:04 +0000 UTC" firstStartedPulling="2026-04-22 18:49:04.469670109 +0000 UTC m=+140.801178539" lastFinishedPulling="2026-04-22 18:49:06.03502641 +0000 UTC m=+142.366534858" observedRunningTime="2026-04-22 18:49:06.741235851 +0000 UTC m=+143.072744308" watchObservedRunningTime="2026-04-22 18:49:06.741359475 +0000 UTC m=+143.072867926"
Apr 22 18:49:07.565389 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:07.565349 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:49:07.565389 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:07.565385 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k"
Apr 22 18:49:07.565759 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:07.565745 2570 scope.go:117] "RemoveContainer" containerID="cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80"
Apr 22 18:49:07.565917 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:07.565901 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nzr8k_openshift-console-operator(26e711ad-d134-4b34-9606-2a2cb1b1f283)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podUID="26e711ad-d134-4b34-9606-2a2cb1b1f283"
Apr 22 18:49:12.991209 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:12.991170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"
Apr 22 18:49:12.991579 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:12.991321 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:49:12.991579 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:12.991391 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls podName:82e68df6-e973-4eb4-9f72-57676895ca9b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:28.991375084 +0000 UTC m=+165.322883514 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qhml5" (UID: "82e68df6-e973-4eb4-9f72-57676895ca9b") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:49:13.092182 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.092146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:13.094481 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.094461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"image-registry-664bcd8945-zfwms\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:13.170526 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.170478 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:13.287380 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.287342 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-664bcd8945-zfwms"] Apr 22 18:49:13.291611 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:13.291584 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7413bf65_85db_4595_b685_fede8886d53f.slice/crio-a659d948d958f4b40a33bbc13edaf221698d0cfceb28b54200043afff88120ce WatchSource:0}: Error finding container a659d948d958f4b40a33bbc13edaf221698d0cfceb28b54200043afff88120ce: Status 404 returned error can't find the container with id a659d948d958f4b40a33bbc13edaf221698d0cfceb28b54200043afff88120ce Apr 22 18:49:13.732149 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.732104 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" event={"ID":"7413bf65-85db-4595-b685-fede8886d53f","Type":"ContainerStarted","Data":"b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b"} Apr 22 18:49:13.732149 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.732154 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" event={"ID":"7413bf65-85db-4595-b685-fede8886d53f","Type":"ContainerStarted","Data":"a659d948d958f4b40a33bbc13edaf221698d0cfceb28b54200043afff88120ce"} Apr 22 18:49:13.732353 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.732246 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:13.756149 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:13.756095 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" 
podStartSLOduration=16.756080906 podStartE2EDuration="16.756080906s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:13.755197962 +0000 UTC m=+150.086706417" watchObservedRunningTime="2026-04-22 18:49:13.756080906 +0000 UTC m=+150.087589407" Apr 22 18:49:18.565168 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:18.565119 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6zgnq" podUID="030083d5-b0a7-4438-a7f6-06eae3c80777" Apr 22 18:49:18.577254 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:18.577221 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-n4k5q" podUID="b72ffd58-b0b5-48d0-b56c-8d4e7f30307b" Apr 22 18:49:18.743330 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:18.743300 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6zgnq" Apr 22 18:49:20.263528 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:20.263492 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mf94f" podUID="383fe532-b742-451a-8a94-dc5c7fd3fce5" Apr 22 18:49:22.244520 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.244444 2570 scope.go:117] "RemoveContainer" containerID="cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80" Apr 22 18:49:22.753204 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.753171 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:49:22.753516 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.753501 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/1.log" Apr 22 18:49:22.753569 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.753533 2570 generic.go:358] "Generic (PLEG): container finished" podID="26e711ad-d134-4b34-9606-2a2cb1b1f283" containerID="8457aa13aafc0a4a005fb964f934d41e44142d413008155f6e51b861936a8d80" exitCode=255 Apr 22 18:49:22.753569 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.753560 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" event={"ID":"26e711ad-d134-4b34-9606-2a2cb1b1f283","Type":"ContainerDied","Data":"8457aa13aafc0a4a005fb964f934d41e44142d413008155f6e51b861936a8d80"} Apr 22 18:49:22.753655 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.753588 2570 scope.go:117] "RemoveContainer" containerID="cc929cdd17b0c37e000da07474453e2d4da33e0f5855a0daadff2418e4559b80" Apr 22 
18:49:22.753960 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:22.753937 2570 scope.go:117] "RemoveContainer" containerID="8457aa13aafc0a4a005fb964f934d41e44142d413008155f6e51b861936a8d80" Apr 22 18:49:22.754178 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:22.754157 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-nzr8k_openshift-console-operator(26e711ad-d134-4b34-9606-2a2cb1b1f283)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podUID="26e711ad-d134-4b34-9606-2a2cb1b1f283" Apr 22 18:49:23.470667 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.470629 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 18:49:23.470667 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.470669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:49:23.473113 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.473090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/030083d5-b0a7-4438-a7f6-06eae3c80777-metrics-tls\") pod \"dns-default-6zgnq\" (UID: \"030083d5-b0a7-4438-a7f6-06eae3c80777\") " pod="openshift-dns/dns-default-6zgnq" Apr 22 18:49:23.473209 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.473133 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/b72ffd58-b0b5-48d0-b56c-8d4e7f30307b-cert\") pod \"ingress-canary-n4k5q\" (UID: \"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b\") " pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:49:23.546102 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.546074 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w2ls6\"" Apr 22 18:49:23.554160 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.554139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zgnq" Apr 22 18:49:23.672374 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.672346 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6zgnq"] Apr 22 18:49:23.675232 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:23.675208 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod030083d5_b0a7_4438_a7f6_06eae3c80777.slice/crio-c15a6b50daf78d4b00ad32042c66d6ab3a50e80a4c5d92c904eabcbce2ce38f0 WatchSource:0}: Error finding container c15a6b50daf78d4b00ad32042c66d6ab3a50e80a4c5d92c904eabcbce2ce38f0: Status 404 returned error can't find the container with id c15a6b50daf78d4b00ad32042c66d6ab3a50e80a4c5d92c904eabcbce2ce38f0 Apr 22 18:49:23.757832 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.757751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zgnq" event={"ID":"030083d5-b0a7-4438-a7f6-06eae3c80777","Type":"ContainerStarted","Data":"c15a6b50daf78d4b00ad32042c66d6ab3a50e80a4c5d92c904eabcbce2ce38f0"} Apr 22 18:49:23.758982 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.758966 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:49:23.835902 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:49:23.835872 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-664bcd8945-zfwms"] Apr 22 18:49:23.838781 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.838755 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gntnd"] Apr 22 18:49:23.844538 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.844521 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.847914 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.847895 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:49:23.848009 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.847895 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:49:23.848009 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.847897 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:49:23.848009 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.847962 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:49:23.848273 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.848260 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-n2rqf\"" Apr 22 18:49:23.850227 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.850211 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gntnd"] Apr 22 18:49:23.874206 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.874184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-crio-socket\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.874306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.874210 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.874306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.874236 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-data-volume\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.874306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.874251 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9jdd\" (UniqueName: \"kubernetes.io/projected/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-kube-api-access-q9jdd\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.874426 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.874362 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gntnd\" (UID: 
\"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.881534 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.881515 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-558fc6cd4f-b58xq"] Apr 22 18:49:23.886952 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.886929 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.894814 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.894794 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558fc6cd4f-b58xq"] Apr 22 18:49:23.975053 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.974969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-registry-tls\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975207 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f24cbaf9-5758-415a-8259-120faaae9cf8-registry-certificates\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975207 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975102 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-bound-sa-token\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: 
\"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975207 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-crio-socket\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975207 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975207 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975187 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-data-volume\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9jdd\" (UniqueName: \"kubernetes.io/projected/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-kube-api-access-q9jdd\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f24cbaf9-5758-415a-8259-120faaae9cf8-installation-pull-secrets\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f24cbaf9-5758-415a-8259-120faaae9cf8-trusted-ca\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975256 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-crio-socket\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f24cbaf9-5758-415a-8259-120faaae9cf8-ca-trust-extracted\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f24cbaf9-5758-415a-8259-120faaae9cf8-image-registry-private-configuration\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: 
\"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975642 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975642 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr88b\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-kube-api-access-wr88b\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:23.975642 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-data-volume\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.975773 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.975755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.977711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.977694 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:23.998939 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:23.998915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9jdd\" (UniqueName: \"kubernetes.io/projected/3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b-kube-api-access-q9jdd\") pod \"insights-runtime-extractor-gntnd\" (UID: \"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b\") " pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:24.076487 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-bound-sa-token\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076487 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f24cbaf9-5758-415a-8259-120faaae9cf8-installation-pull-secrets\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076556 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f24cbaf9-5758-415a-8259-120faaae9cf8-trusted-ca\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " 
pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f24cbaf9-5758-415a-8259-120faaae9cf8-ca-trust-extracted\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076711 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076614 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f24cbaf9-5758-415a-8259-120faaae9cf8-image-registry-private-configuration\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076873 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076816 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr88b\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-kube-api-access-wr88b\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076932 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076888 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-registry-tls\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.076932 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.076918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f24cbaf9-5758-415a-8259-120faaae9cf8-registry-certificates\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.077121 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.077098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f24cbaf9-5758-415a-8259-120faaae9cf8-ca-trust-extracted\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.077962 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.077940 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f24cbaf9-5758-415a-8259-120faaae9cf8-registry-certificates\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.078107 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.078087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f24cbaf9-5758-415a-8259-120faaae9cf8-trusted-ca\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.079473 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.079450 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f24cbaf9-5758-415a-8259-120faaae9cf8-image-registry-private-configuration\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " 
pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.079554 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.079523 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f24cbaf9-5758-415a-8259-120faaae9cf8-installation-pull-secrets\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.080244 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.080227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-registry-tls\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.089850 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.089831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-bound-sa-token\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.097163 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.097143 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr88b\" (UniqueName: \"kubernetes.io/projected/f24cbaf9-5758-415a-8259-120faaae9cf8-kube-api-access-wr88b\") pod \"image-registry-558fc6cd4f-b58xq\" (UID: \"f24cbaf9-5758-415a-8259-120faaae9cf8\") " pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.156135 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.156106 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gntnd" Apr 22 18:49:24.195257 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.195227 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.303280 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.303250 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gntnd"] Apr 22 18:49:24.344115 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.344073 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558fc6cd4f-b58xq"] Apr 22 18:49:24.347688 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:24.347631 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24cbaf9_5758_415a_8259_120faaae9cf8.slice/crio-2f62e0e731254b6cb7093d845e20fc94975cbfb2fa004f1f8b72df452b7128b2 WatchSource:0}: Error finding container 2f62e0e731254b6cb7093d845e20fc94975cbfb2fa004f1f8b72df452b7128b2: Status 404 returned error can't find the container with id 2f62e0e731254b6cb7093d845e20fc94975cbfb2fa004f1f8b72df452b7128b2 Apr 22 18:49:24.765642 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.765599 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gntnd" event={"ID":"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b","Type":"ContainerStarted","Data":"5a1c1cb6f9bc144d44f828dc5783aa7d9f914419073483de3bbd2d8e8205f427"} Apr 22 18:49:24.766132 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.765648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gntnd" event={"ID":"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b","Type":"ContainerStarted","Data":"4efc95aba9c95ca1f10533e7ef10bf319004c86eb31f8904427de519e3a54416"} Apr 22 18:49:24.767166 ip-10-0-133-163 
kubenswrapper[2570]: I0422 18:49:24.767119 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" event={"ID":"f24cbaf9-5758-415a-8259-120faaae9cf8","Type":"ContainerStarted","Data":"5025f1879989cdfc45c5bc0ff1661cecd306e75f782a45ae554739604f192269"} Apr 22 18:49:24.767166 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.767158 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" event={"ID":"f24cbaf9-5758-415a-8259-120faaae9cf8","Type":"ContainerStarted","Data":"2f62e0e731254b6cb7093d845e20fc94975cbfb2fa004f1f8b72df452b7128b2"} Apr 22 18:49:24.767398 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.767373 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:24.788906 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:24.788862 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" podStartSLOduration=1.78884441 podStartE2EDuration="1.78884441s" podCreationTimestamp="2026-04-22 18:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:24.787850387 +0000 UTC m=+161.119358840" watchObservedRunningTime="2026-04-22 18:49:24.78884441 +0000 UTC m=+161.120352866" Apr 22 18:49:25.773867 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:25.773781 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gntnd" event={"ID":"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b","Type":"ContainerStarted","Data":"f6881d692882e64507844bd15ddf88f0495c755d078aadd7c3ed26ca27bf6aa5"} Apr 22 18:49:25.775805 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:25.775771 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-6zgnq" event={"ID":"030083d5-b0a7-4438-a7f6-06eae3c80777","Type":"ContainerStarted","Data":"c7e82390f9df78f4245b816260012100041370ff0e7781b6529e0861dc31c281"} Apr 22 18:49:25.775950 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:25.775811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zgnq" event={"ID":"030083d5-b0a7-4438-a7f6-06eae3c80777","Type":"ContainerStarted","Data":"5714243c645db5a14adf85ff3a9ecdbe6cfa5e62364120cff192a2b9ee573842"} Apr 22 18:49:25.794681 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:25.794623 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6zgnq" podStartSLOduration=129.361562063 podStartE2EDuration="2m10.794605102s" podCreationTimestamp="2026-04-22 18:47:15 +0000 UTC" firstStartedPulling="2026-04-22 18:49:23.677590685 +0000 UTC m=+160.009099114" lastFinishedPulling="2026-04-22 18:49:25.11063372 +0000 UTC m=+161.442142153" observedRunningTime="2026-04-22 18:49:25.793753271 +0000 UTC m=+162.125261724" watchObservedRunningTime="2026-04-22 18:49:25.794605102 +0000 UTC m=+162.126113555" Apr 22 18:49:26.780058 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:26.780003 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gntnd" event={"ID":"3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b","Type":"ContainerStarted","Data":"d8c4c10a5bfcde49deef22dd0497700c5bdf85d8ad71092e426f1ecb6ab87d33"} Apr 22 18:49:26.780521 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:26.780162 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6zgnq" Apr 22 18:49:26.798613 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:26.798571 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gntnd" podStartSLOduration=1.6040073019999999 podStartE2EDuration="3.798557007s" 
podCreationTimestamp="2026-04-22 18:49:23 +0000 UTC" firstStartedPulling="2026-04-22 18:49:24.409091027 +0000 UTC m=+160.740599474" lastFinishedPulling="2026-04-22 18:49:26.603640743 +0000 UTC m=+162.935149179" observedRunningTime="2026-04-22 18:49:26.796827636 +0000 UTC m=+163.128336103" watchObservedRunningTime="2026-04-22 18:49:26.798557007 +0000 UTC m=+163.130065455" Apr 22 18:49:27.565256 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:27.565223 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" Apr 22 18:49:27.565256 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:27.565257 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" Apr 22 18:49:27.565578 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:27.565566 2570 scope.go:117] "RemoveContainer" containerID="8457aa13aafc0a4a005fb964f934d41e44142d413008155f6e51b861936a8d80" Apr 22 18:49:27.565734 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:27.565719 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-nzr8k_openshift-console-operator(26e711ad-d134-4b34-9606-2a2cb1b1f283)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podUID="26e711ad-d134-4b34-9606-2a2cb1b1f283" Apr 22 18:49:29.018113 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:29.018078 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" Apr 22 18:49:29.020405 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:29.020384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e68df6-e973-4eb4-9f72-57676895ca9b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qhml5\" (UID: \"82e68df6-e973-4eb4-9f72-57676895ca9b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" Apr 22 18:49:29.254322 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:29.254293 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" Apr 22 18:49:29.383058 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:29.382985 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5"] Apr 22 18:49:29.386127 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:29.386099 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e68df6_e973_4eb4_9f72_57676895ca9b.slice/crio-88720556d4d4bd07f1403461f408f55efddd468aa2be012998d9ad37ad5ea1be WatchSource:0}: Error finding container 88720556d4d4bd07f1403461f408f55efddd468aa2be012998d9ad37ad5ea1be: Status 404 returned error can't find the container with id 88720556d4d4bd07f1403461f408f55efddd468aa2be012998d9ad37ad5ea1be Apr 22 18:49:29.790658 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:29.790620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" event={"ID":"82e68df6-e973-4eb4-9f72-57676895ca9b","Type":"ContainerStarted","Data":"88720556d4d4bd07f1403461f408f55efddd468aa2be012998d9ad37ad5ea1be"} Apr 22 18:49:31.244293 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.244262 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:49:31.246972 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.246948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q94s6\"" Apr 22 18:49:31.249925 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.249903 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8"] Apr 22 18:49:31.252793 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.252778 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:31.254864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.254844 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n4k5q" Apr 22 18:49:31.255229 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.255210 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:49:31.255408 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.255231 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-shw7g\"" Apr 22 18:49:31.260274 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.260032 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8"] Apr 22 18:49:31.338820 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.338742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b7b9bf0c-4a86-427f-881b-3915077f0c2d-tls-certificates\") 
pod \"prometheus-operator-admission-webhook-57cf98b594-gm7d8\" (UID: \"b7b9bf0c-4a86-427f-881b-3915077f0c2d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:31.367848 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.367822 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n4k5q"] Apr 22 18:49:31.370644 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:31.370619 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb72ffd58_b0b5_48d0_b56c_8d4e7f30307b.slice/crio-9a8fab6cad89b20102152cd08eb99da33e82c8031b42c371249b3a0b822eb8ee WatchSource:0}: Error finding container 9a8fab6cad89b20102152cd08eb99da33e82c8031b42c371249b3a0b822eb8ee: Status 404 returned error can't find the container with id 9a8fab6cad89b20102152cd08eb99da33e82c8031b42c371249b3a0b822eb8ee Apr 22 18:49:31.439331 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.439303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b7b9bf0c-4a86-427f-881b-3915077f0c2d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gm7d8\" (UID: \"b7b9bf0c-4a86-427f-881b-3915077f0c2d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:31.439445 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:31.439419 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:49:31.439504 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:31.439490 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b9bf0c-4a86-427f-881b-3915077f0c2d-tls-certificates podName:b7b9bf0c-4a86-427f-881b-3915077f0c2d nodeName:}" failed. 
No retries permitted until 2026-04-22 18:49:31.939469531 +0000 UTC m=+168.270977981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/b7b9bf0c-4a86-427f-881b-3915077f0c2d-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-gm7d8" (UID: "b7b9bf0c-4a86-427f-881b-3915077f0c2d") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:49:31.797412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.797373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n4k5q" event={"ID":"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b","Type":"ContainerStarted","Data":"9a8fab6cad89b20102152cd08eb99da33e82c8031b42c371249b3a0b822eb8ee"} Apr 22 18:49:31.798867 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.798837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" event={"ID":"82e68df6-e973-4eb4-9f72-57676895ca9b","Type":"ContainerStarted","Data":"41dbc04d53d9e14d4556541a32d51fe8f6ca0e3b6f3af8c2e1066b4e44da1e4f"} Apr 22 18:49:31.816374 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.816319 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qhml5" podStartSLOduration=33.461236393 podStartE2EDuration="34.816302831s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:49:29.387876304 +0000 UTC m=+165.719384733" lastFinishedPulling="2026-04-22 18:49:30.742942727 +0000 UTC m=+167.074451171" observedRunningTime="2026-04-22 18:49:31.8155424 +0000 UTC m=+168.147050855" watchObservedRunningTime="2026-04-22 18:49:31.816302831 +0000 UTC m=+168.147811282" Apr 22 18:49:31.942682 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.942647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" 
(UniqueName: \"kubernetes.io/secret/b7b9bf0c-4a86-427f-881b-3915077f0c2d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gm7d8\" (UID: \"b7b9bf0c-4a86-427f-881b-3915077f0c2d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:31.945260 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:31.945235 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b7b9bf0c-4a86-427f-881b-3915077f0c2d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gm7d8\" (UID: \"b7b9bf0c-4a86-427f-881b-3915077f0c2d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:32.174129 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:32.174091 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:32.303243 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:32.303205 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8"] Apr 22 18:49:32.307082 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:32.307049 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b9bf0c_4a86_427f_881b_3915077f0c2d.slice/crio-1d15436f0dc9fc6a45ce4dd2dc0198df53320606724a1f4a72e8130883483b5b WatchSource:0}: Error finding container 1d15436f0dc9fc6a45ce4dd2dc0198df53320606724a1f4a72e8130883483b5b: Status 404 returned error can't find the container with id 1d15436f0dc9fc6a45ce4dd2dc0198df53320606724a1f4a72e8130883483b5b Apr 22 18:49:32.803965 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:32.803909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" 
event={"ID":"b7b9bf0c-4a86-427f-881b-3915077f0c2d","Type":"ContainerStarted","Data":"1d15436f0dc9fc6a45ce4dd2dc0198df53320606724a1f4a72e8130883483b5b"} Apr 22 18:49:33.807597 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.807513 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n4k5q" event={"ID":"b72ffd58-b0b5-48d0-b56c-8d4e7f30307b","Type":"ContainerStarted","Data":"444f894d89c7eb7dfd0738837acf5536540e253ec3782a4853ba57d85f7e842b"} Apr 22 18:49:33.808874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.808849 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" event={"ID":"b7b9bf0c-4a86-427f-881b-3915077f0c2d","Type":"ContainerStarted","Data":"9467fe8b8c1e1e9521984fe4b9f54bf5f87b202f68acf1edbb9e90312d58c4d3"} Apr 22 18:49:33.809049 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.809037 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:33.813427 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.813405 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" Apr 22 18:49:33.824135 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.824095 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n4k5q" podStartSLOduration=137.321876484 podStartE2EDuration="2m18.824083851s" podCreationTimestamp="2026-04-22 18:47:15 +0000 UTC" firstStartedPulling="2026-04-22 18:49:31.372758962 +0000 UTC m=+167.704267392" lastFinishedPulling="2026-04-22 18:49:32.874966318 +0000 UTC m=+169.206474759" observedRunningTime="2026-04-22 18:49:33.821981026 +0000 UTC m=+170.153489472" watchObservedRunningTime="2026-04-22 18:49:33.824083851 +0000 UTC m=+170.155592302" 
Apr 22 18:49:33.838307 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.838267 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gm7d8" podStartSLOduration=1.602118066 podStartE2EDuration="2.838254145s" podCreationTimestamp="2026-04-22 18:49:31 +0000 UTC" firstStartedPulling="2026-04-22 18:49:32.309081542 +0000 UTC m=+168.640589976" lastFinishedPulling="2026-04-22 18:49:33.545217625 +0000 UTC m=+169.876726055" observedRunningTime="2026-04-22 18:49:33.83740002 +0000 UTC m=+170.168908473" watchObservedRunningTime="2026-04-22 18:49:33.838254145 +0000 UTC m=+170.169762620" Apr 22 18:49:33.840898 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.840874 2570 patch_prober.go:28] interesting pod/image-registry-664bcd8945-zfwms container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:33.840976 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:33.840919 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" podUID="7413bf65-85db-4595-b685-fede8886d53f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:34.249908 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.249877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:49:34.316927 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.316897 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mxhqc"] Apr 22 18:49:34.320117 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.320097 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.323790 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.323769 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 18:49:34.323925 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.323841 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 18:49:34.323925 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.323841 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qfgx9\"" Apr 22 18:49:34.323925 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.323841 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:34.327534 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.327516 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mxhqc"] Apr 22 18:49:34.464925 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.464873 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-792rq\" (UniqueName: \"kubernetes.io/projected/47dff405-7883-453c-b066-b2b17d2440a6-kube-api-access-792rq\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.465105 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.464944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/47dff405-7883-453c-b066-b2b17d2440a6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mxhqc\" 
(UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.465105 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.464974 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47dff405-7883-453c-b066-b2b17d2440a6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.465105 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.465092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47dff405-7883-453c-b066-b2b17d2440a6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.566037 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.565913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47dff405-7883-453c-b066-b2b17d2440a6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.566037 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.565979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-792rq\" (UniqueName: \"kubernetes.io/projected/47dff405-7883-453c-b066-b2b17d2440a6-kube-api-access-792rq\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.566037 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.566008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/47dff405-7883-453c-b066-b2b17d2440a6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.566317 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.566068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47dff405-7883-453c-b066-b2b17d2440a6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.567588 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.567563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47dff405-7883-453c-b066-b2b17d2440a6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.568419 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.568392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47dff405-7883-453c-b066-b2b17d2440a6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.568520 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.568476 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/47dff405-7883-453c-b066-b2b17d2440a6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.575219 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.575197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-792rq\" (UniqueName: \"kubernetes.io/projected/47dff405-7883-453c-b066-b2b17d2440a6-kube-api-access-792rq\") pod \"prometheus-operator-5676c8c784-mxhqc\" (UID: \"47dff405-7883-453c-b066-b2b17d2440a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.630241 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.630204 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" Apr 22 18:49:34.744105 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.744071 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-mxhqc"] Apr 22 18:49:34.747287 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:34.747261 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47dff405_7883_453c_b066_b2b17d2440a6.slice/crio-9d7678592287d0faacaecce3967ca0ce8d6864e0edf5ecf7a6a8ce1dc001a30a WatchSource:0}: Error finding container 9d7678592287d0faacaecce3967ca0ce8d6864e0edf5ecf7a6a8ce1dc001a30a: Status 404 returned error can't find the container with id 9d7678592287d0faacaecce3967ca0ce8d6864e0edf5ecf7a6a8ce1dc001a30a Apr 22 18:49:34.812568 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:34.812525 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" 
event={"ID":"47dff405-7883-453c-b066-b2b17d2440a6","Type":"ContainerStarted","Data":"9d7678592287d0faacaecce3967ca0ce8d6864e0edf5ecf7a6a8ce1dc001a30a"} Apr 22 18:49:36.785117 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:36.785087 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6zgnq" Apr 22 18:49:36.818882 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:36.818853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" event={"ID":"47dff405-7883-453c-b066-b2b17d2440a6","Type":"ContainerStarted","Data":"17bc8ef5898ad036f57487d9c530b265cd46f670187fa50b599add3601b734f5"} Apr 22 18:49:36.818882 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:36.818886 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" event={"ID":"47dff405-7883-453c-b066-b2b17d2440a6","Type":"ContainerStarted","Data":"5fb61936a069700b2b4200dc1357101ad2505c81fc810f165efadf21b154e7f4"} Apr 22 18:49:36.835730 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:36.835688 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-mxhqc" podStartSLOduration=1.7690970959999999 podStartE2EDuration="2.835673752s" podCreationTimestamp="2026-04-22 18:49:34 +0000 UTC" firstStartedPulling="2026-04-22 18:49:34.749470881 +0000 UTC m=+171.080979311" lastFinishedPulling="2026-04-22 18:49:35.816047537 +0000 UTC m=+172.147555967" observedRunningTime="2026-04-22 18:49:36.834798219 +0000 UTC m=+173.166306673" watchObservedRunningTime="2026-04-22 18:49:36.835673752 +0000 UTC m=+173.167182201" Apr 22 18:49:38.748272 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.748236 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5bflg"] Apr 22 18:49:38.751825 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.751801 2570 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.754459 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.754436 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:38.754572 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.754553 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mrmnz\"" Apr 22 18:49:38.754785 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.754700 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:38.754785 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.754765 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:38.901197 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901158 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901377 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901212 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-wtmp\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901377 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901271 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-accelerators-collector-config\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901377 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46d280c1-d49f-4d77-8930-94aad7b2b5e2-metrics-client-ca\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901377 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901332 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-tls\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901377 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-sys\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901622 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwtzd\" (UniqueName: \"kubernetes.io/projected/46d280c1-d49f-4d77-8930-94aad7b2b5e2-kube-api-access-rwtzd\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " 
pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901622 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-textfile\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:38.901622 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:38.901454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-root\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002562 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46d280c1-d49f-4d77-8930-94aad7b2b5e2-metrics-client-ca\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002562 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-tls\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002562 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002534 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-sys\") pod \"node-exporter-5bflg\" (UID: 
\"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002562 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002552 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwtzd\" (UniqueName: \"kubernetes.io/projected/46d280c1-d49f-4d77-8930-94aad7b2b5e2-kube-api-access-rwtzd\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002593 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-textfile\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002613 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-root\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-sys\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002651 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-wtmp\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.002875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.002755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-accelerators-collector-config\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.003192 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.003069 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-root\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.003192 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.003128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-wtmp\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.003289 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.003257 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-accelerators-collector-config\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.003372 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.003351 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46d280c1-d49f-4d77-8930-94aad7b2b5e2-metrics-client-ca\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.003556 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.003511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-textfile\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.005231 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.005212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.005526 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.005506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/46d280c1-d49f-4d77-8930-94aad7b2b5e2-node-exporter-tls\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.011490 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.011453 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rwtzd\" (UniqueName: \"kubernetes.io/projected/46d280c1-d49f-4d77-8930-94aad7b2b5e2-kube-api-access-rwtzd\") pod \"node-exporter-5bflg\" (UID: \"46d280c1-d49f-4d77-8930-94aad7b2b5e2\") " pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.061187 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.061165 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5bflg" Apr 22 18:49:39.068814 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:39.068789 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d280c1_d49f_4d77_8930_94aad7b2b5e2.slice/crio-990cb591119d1b4e0e75341588dc99368c00c973790c22bb8396212f427f7e43 WatchSource:0}: Error finding container 990cb591119d1b4e0e75341588dc99368c00c973790c22bb8396212f427f7e43: Status 404 returned error can't find the container with id 990cb591119d1b4e0e75341588dc99368c00c973790c22bb8396212f427f7e43 Apr 22 18:49:39.723871 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.723812 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:49:39.728568 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.728545 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.731512 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.731469 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:49:39.731884 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.731831 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:49:39.732034 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732003 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:49:39.732114 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732035 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dnr65\"" Apr 22 18:49:39.732283 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732257 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:49:39.732370 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732290 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:49:39.732733 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732494 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:49:39.732733 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732515 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:49:39.732733 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.732498 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:49:39.733627 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.733447 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:49:39.744979 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.744796 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:49:39.810414 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810374 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810433 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810537 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-web-config\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-out\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsrvp\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-kube-api-access-nsrvp\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810729 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
22 18:49:39.810868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.811315 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.811315 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810945 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.811315 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.810987 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.811315 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.811050 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-volume\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.828341 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.828317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5bflg" event={"ID":"46d280c1-d49f-4d77-8930-94aad7b2b5e2","Type":"ContainerStarted","Data":"990cb591119d1b4e0e75341588dc99368c00c973790c22bb8396212f427f7e43"} Apr 22 18:49:39.911802 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-web-config\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.911904 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-out\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.911904 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911831 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsrvp\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-kube-api-access-nsrvp\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.911904 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.911904 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911932 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.911979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912004 
2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912071 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912125 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-volume\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912421 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912421 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912930 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912750 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.912930 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.912897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.914704 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.914477 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.914966 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.914943 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-out\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.915358 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.915255 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-web-config\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.915454 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.915404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.915550 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.915526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.915830 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.915796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.916082 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.916061 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.916793 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.916454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.917384 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.917356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.917483 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.917462 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:39.928275 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:39.928252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsrvp\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-kube-api-access-nsrvp\") pod \"alertmanager-main-0\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:40.040747 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:40.040709 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:49:40.171080 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:40.171039 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:49:40.174086 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:40.174056 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd39edff9_1df2_4987_9004_e7da2c37c7eb.slice/crio-dbd5ae042abf4a51ff6a8c0d8312817be2f3180dbd5db5f6951147ff74b09a79 WatchSource:0}: Error finding container dbd5ae042abf4a51ff6a8c0d8312817be2f3180dbd5db5f6951147ff74b09a79: Status 404 returned error can't find the container with id dbd5ae042abf4a51ff6a8c0d8312817be2f3180dbd5db5f6951147ff74b09a79 Apr 22 18:49:40.832189 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:40.832152 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"dbd5ae042abf4a51ff6a8c0d8312817be2f3180dbd5db5f6951147ff74b09a79"} Apr 22 18:49:40.833833 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:40.833801 2570 generic.go:358] "Generic (PLEG): container finished" podID="46d280c1-d49f-4d77-8930-94aad7b2b5e2" containerID="2eb62efd471c7b9e2d968dfd1773966dadddfe4bd84f053677608fa0ee99edcc" exitCode=0 Apr 22 18:49:40.833952 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:40.833847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5bflg" event={"ID":"46d280c1-d49f-4d77-8930-94aad7b2b5e2","Type":"ContainerDied","Data":"2eb62efd471c7b9e2d968dfd1773966dadddfe4bd84f053677608fa0ee99edcc"} Apr 22 18:49:41.244623 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:41.244597 2570 scope.go:117] "RemoveContainer" containerID="8457aa13aafc0a4a005fb964f934d41e44142d413008155f6e51b861936a8d80" Apr 22 18:49:41.244784 ip-10-0-133-163 
kubenswrapper[2570]: E0422 18:49:41.244771 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-nzr8k_openshift-console-operator(26e711ad-d134-4b34-9606-2a2cb1b1f283)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podUID="26e711ad-d134-4b34-9606-2a2cb1b1f283" Apr 22 18:49:41.838229 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:41.838196 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerID="be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e" exitCode=0 Apr 22 18:49:41.838686 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:41.838274 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e"} Apr 22 18:49:41.840412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:41.840389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5bflg" event={"ID":"46d280c1-d49f-4d77-8930-94aad7b2b5e2","Type":"ContainerStarted","Data":"5bca4d4352277dc41bc1bde694647336f641ab3dc3244632194b53322c60cb7a"} Apr 22 18:49:41.840510 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:41.840419 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5bflg" event={"ID":"46d280c1-d49f-4d77-8930-94aad7b2b5e2","Type":"ContainerStarted","Data":"e21f7e9725978baad95c025053075ed292a99afba70a9058ab56f56304f02ac0"} Apr 22 18:49:41.888828 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:41.888771 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5bflg" podStartSLOduration=3.076323422 
podStartE2EDuration="3.888753812s" podCreationTimestamp="2026-04-22 18:49:38 +0000 UTC" firstStartedPulling="2026-04-22 18:49:39.07051422 +0000 UTC m=+175.402022654" lastFinishedPulling="2026-04-22 18:49:39.882944614 +0000 UTC m=+176.214453044" observedRunningTime="2026-04-22 18:49:41.888141207 +0000 UTC m=+178.219649660" watchObservedRunningTime="2026-04-22 18:49:41.888753812 +0000 UTC m=+178.220262266" Apr 22 18:49:43.015954 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.015930 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fc6d96979-n9gdr"] Apr 22 18:49:43.021534 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.020316 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.024404 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.024385 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-fwhvd\"" Apr 22 18:49:43.024505 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.024407 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:49:43.024565 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.024506 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:49:43.024882 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.024857 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2qeta9j20clov\"" Apr 22 18:49:43.024963 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.024879 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:49:43.024963 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.024953 2570 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:49:43.030217 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.030196 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fc6d96979-n9gdr"] Apr 22 18:49:43.139250 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ba1c7391-3310-4c4f-84cd-552e47593e99-audit-log\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.139345 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ba1c7391-3310-4c4f-84cd-552e47593e99-metrics-server-audit-profiles\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.139345 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-client-ca-bundle\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.139468 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-secret-metrics-server-tls\") 
pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.139468 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-secret-metrics-server-client-certs\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.139468 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139466 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jdg6\" (UniqueName: \"kubernetes.io/projected/ba1c7391-3310-4c4f-84cd-552e47593e99-kube-api-access-8jdg6\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.139557 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.139499 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba1c7391-3310-4c4f-84cd-552e47593e99-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240331 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba1c7391-3310-4c4f-84cd-552e47593e99-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " 
pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240423 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ba1c7391-3310-4c4f-84cd-552e47593e99-audit-log\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240423 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ba1c7391-3310-4c4f-84cd-552e47593e99-metrics-server-audit-profiles\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240423 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240389 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-client-ca-bundle\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240582 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-secret-metrics-server-tls\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240582 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-secret-metrics-server-client-certs\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240582 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jdg6\" (UniqueName: \"kubernetes.io/projected/ba1c7391-3310-4c4f-84cd-552e47593e99-kube-api-access-8jdg6\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.240929 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.240900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ba1c7391-3310-4c4f-84cd-552e47593e99-audit-log\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.241238 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.241214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba1c7391-3310-4c4f-84cd-552e47593e99-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.241776 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.241756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ba1c7391-3310-4c4f-84cd-552e47593e99-metrics-server-audit-profiles\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " 
pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.242773 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.242753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-client-ca-bundle\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.242863 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.242827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-secret-metrics-server-tls\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.242963 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.242947 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ba1c7391-3310-4c4f-84cd-552e47593e99-secret-metrics-server-client-certs\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.252466 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.252445 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jdg6\" (UniqueName: \"kubernetes.io/projected/ba1c7391-3310-4c4f-84cd-552e47593e99-kube-api-access-8jdg6\") pod \"metrics-server-5fc6d96979-n9gdr\" (UID: \"ba1c7391-3310-4c4f-84cd-552e47593e99\") " pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.332102 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.332072 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:49:43.456993 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.456959 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fc6d96979-n9gdr"] Apr 22 18:49:43.459888 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:43.459863 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1c7391_3310_4c4f_84cd_552e47593e99.slice/crio-34c935b4e7a9a51967024ee122a80adec7192ee735314ed1ba68033e2c4270a5 WatchSource:0}: Error finding container 34c935b4e7a9a51967024ee122a80adec7192ee735314ed1ba68033e2c4270a5: Status 404 returned error can't find the container with id 34c935b4e7a9a51967024ee122a80adec7192ee735314ed1ba68033e2c4270a5 Apr 22 18:49:43.466058 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.466031 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7"] Apr 22 18:49:43.470878 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.470863 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 22 18:49:43.473433 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.473411 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-mqwwn\"" Apr 22 18:49:43.473529 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.473457 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:49:43.479181 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.479161 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7"] Apr 22 18:49:43.643430 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.643399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d50ce585-dd21-4431-b64f-6f4220eb4fab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zm2s7\" (UID: \"d50ce585-dd21-4431-b64f-6f4220eb4fab\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 22 18:49:43.744758 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.744678 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d50ce585-dd21-4431-b64f-6f4220eb4fab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zm2s7\" (UID: \"d50ce585-dd21-4431-b64f-6f4220eb4fab\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 22 18:49:43.747455 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.747428 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d50ce585-dd21-4431-b64f-6f4220eb4fab-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zm2s7\" (UID: \"d50ce585-dd21-4431-b64f-6f4220eb4fab\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 22 18:49:43.781042 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.780976 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 22 18:49:43.842206 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.842178 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:43.857429 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.857387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12"} Apr 22 18:49:43.857429 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.857424 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03"} Apr 22 18:49:43.857429 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.857436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c"} Apr 22 18:49:43.857665 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.857445 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09"} Apr 22 18:49:43.857665 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.857453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4"} Apr 22 18:49:43.858538 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.858514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" event={"ID":"ba1c7391-3310-4c4f-84cd-552e47593e99","Type":"ContainerStarted","Data":"34c935b4e7a9a51967024ee122a80adec7192ee735314ed1ba68033e2c4270a5"} Apr 22 18:49:43.926139 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:43.926107 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7"] Apr 22 18:49:43.996813 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:43.996737 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50ce585_dd21_4431_b64f_6f4220eb4fab.slice/crio-c8b60d17c7ae791555109a0aaced3f126bcfbf83636784e5af73f48c277d83ec WatchSource:0}: Error finding container c8b60d17c7ae791555109a0aaced3f126bcfbf83636784e5af73f48c277d83ec: Status 404 returned error can't find the container with id c8b60d17c7ae791555109a0aaced3f126bcfbf83636784e5af73f48c277d83ec Apr 22 18:49:44.867469 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:44.867432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerStarted","Data":"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70"} Apr 22 18:49:44.868800 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:44.868771 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" event={"ID":"d50ce585-dd21-4431-b64f-6f4220eb4fab","Type":"ContainerStarted","Data":"c8b60d17c7ae791555109a0aaced3f126bcfbf83636784e5af73f48c277d83ec"} Apr 22 
18:49:44.902067 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:44.902008 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.035979131 podStartE2EDuration="5.901991237s" podCreationTimestamp="2026-04-22 18:49:39 +0000 UTC" firstStartedPulling="2026-04-22 18:49:40.175976906 +0000 UTC m=+176.507485340" lastFinishedPulling="2026-04-22 18:49:44.041989015 +0000 UTC m=+180.373497446" observedRunningTime="2026-04-22 18:49:44.899412328 +0000 UTC m=+181.230920790" watchObservedRunningTime="2026-04-22 18:49:44.901991237 +0000 UTC m=+181.233499688" Apr 22 18:49:45.780599 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.780571 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-558fc6cd4f-b58xq" Apr 22 18:49:45.872530 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.872492 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" event={"ID":"d50ce585-dd21-4431-b64f-6f4220eb4fab","Type":"ContainerStarted","Data":"67b3ab32519caf4d1518961a30409776db2e34345992abd06cb34b459b5557fb"} Apr 22 18:49:45.872938 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.872542 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 22 18:49:45.873918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.873882 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" event={"ID":"ba1c7391-3310-4c4f-84cd-552e47593e99","Type":"ContainerStarted","Data":"d685cc00a4866efe358669bf37f3d69c171356ac30937b7604e11ab898eead2e"} Apr 22 18:49:45.877219 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.877196 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" Apr 
22 18:49:45.890331 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.890286 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zm2s7" podStartSLOduration=1.49541756 podStartE2EDuration="2.890274253s" podCreationTimestamp="2026-04-22 18:49:43 +0000 UTC" firstStartedPulling="2026-04-22 18:49:43.998834714 +0000 UTC m=+180.330343145" lastFinishedPulling="2026-04-22 18:49:45.393691395 +0000 UTC m=+181.725199838" observedRunningTime="2026-04-22 18:49:45.88888109 +0000 UTC m=+182.220389543" watchObservedRunningTime="2026-04-22 18:49:45.890274253 +0000 UTC m=+182.221782705" Apr 22 18:49:45.907443 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:45.907405 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" podStartSLOduration=2.021606849 podStartE2EDuration="3.907393529s" podCreationTimestamp="2026-04-22 18:49:42 +0000 UTC" firstStartedPulling="2026-04-22 18:49:43.46175999 +0000 UTC m=+179.793268420" lastFinishedPulling="2026-04-22 18:49:45.347546661 +0000 UTC m=+181.679055100" observedRunningTime="2026-04-22 18:49:45.905948712 +0000 UTC m=+182.237457165" watchObservedRunningTime="2026-04-22 18:49:45.907393529 +0000 UTC m=+182.238901982" Apr 22 18:49:48.857391 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:48.857354 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" podUID="7413bf65-85db-4595-b685-fede8886d53f" containerName="registry" containerID="cri-o://b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b" gracePeriod=30 Apr 22 18:49:49.081786 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.081764 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:49.187390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187303 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-registry-certificates\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187370 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187403 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7413bf65-85db-4595-b685-fede8886d53f-ca-trust-extracted\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187427 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-bound-sa-token\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187469 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-image-registry-private-configuration\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: 
\"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187507 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-installation-pull-secrets\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187534 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-trusted-ca\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187669 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187581 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl77g\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-kube-api-access-gl77g\") pod \"7413bf65-85db-4595-b685-fede8886d53f\" (UID: \"7413bf65-85db-4595-b685-fede8886d53f\") " Apr 22 18:49:49.187969 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.187806 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:49.188351 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.188324 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:49.190086 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.190044 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-kube-api-access-gl77g" (OuterVolumeSpecName: "kube-api-access-gl77g") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "kube-api-access-gl77g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:49.190301 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.190275 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:49.190409 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.190275 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:49.190409 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.190320 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:49.190409 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.190339 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:49.195925 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.195901 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7413bf65-85db-4595-b685-fede8886d53f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7413bf65-85db-4595-b685-fede8886d53f" (UID: "7413bf65-85db-4595-b685-fede8886d53f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:49.288871 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288844 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-installation-pull-secrets\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.288871 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288866 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-trusted-ca\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.288871 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288877 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gl77g\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-kube-api-access-gl77g\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.289110 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288886 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7413bf65-85db-4595-b685-fede8886d53f-registry-certificates\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.289110 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288895 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-registry-tls\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.289110 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288904 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7413bf65-85db-4595-b685-fede8886d53f-ca-trust-extracted\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.289110 
ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288912 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7413bf65-85db-4595-b685-fede8886d53f-bound-sa-token\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.289110 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.288921 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7413bf65-85db-4595-b685-fede8886d53f-image-registry-private-configuration\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.886560 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.885993 2570 generic.go:358] "Generic (PLEG): container finished" podID="7413bf65-85db-4595-b685-fede8886d53f" containerID="b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b" exitCode=0 Apr 22 18:49:49.886560 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.886082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" event={"ID":"7413bf65-85db-4595-b685-fede8886d53f","Type":"ContainerDied","Data":"b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b"} Apr 22 18:49:49.886560 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.886110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" event={"ID":"7413bf65-85db-4595-b685-fede8886d53f","Type":"ContainerDied","Data":"a659d948d958f4b40a33bbc13edaf221698d0cfceb28b54200043afff88120ce"} Apr 22 18:49:49.886560 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.886129 2570 scope.go:117] "RemoveContainer" containerID="b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b" Apr 22 18:49:49.886560 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.886268 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-664bcd8945-zfwms" Apr 22 18:49:49.896641 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.895685 2570 scope.go:117] "RemoveContainer" containerID="b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b" Apr 22 18:49:49.896641 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:49:49.895932 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b\": container with ID starting with b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b not found: ID does not exist" containerID="b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b" Apr 22 18:49:49.896641 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.895963 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b"} err="failed to get container status \"b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b\": rpc error: code = NotFound desc = could not find container \"b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b\": container with ID starting with b373507544f262c7a25ad3fbea06f75d0277f24439e9ab6fc3190e5528ef777b not found: ID does not exist" Apr 22 18:49:49.911652 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.911612 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-664bcd8945-zfwms"] Apr 22 18:49:49.916572 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:49.916549 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-664bcd8945-zfwms"] Apr 22 18:49:50.247922 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:50.247852 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7413bf65-85db-4595-b685-fede8886d53f" 
path="/var/lib/kubelet/pods/7413bf65-85db-4595-b685-fede8886d53f/volumes" Apr 22 18:49:53.244321 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:53.244290 2570 scope.go:117] "RemoveContainer" containerID="8457aa13aafc0a4a005fb964f934d41e44142d413008155f6e51b861936a8d80" Apr 22 18:49:53.899835 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:53.899811 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:49:53.900054 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:53.899875 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" event={"ID":"26e711ad-d134-4b34-9606-2a2cb1b1f283","Type":"ContainerStarted","Data":"5bb7a43f32f09390efb446586dee00b495f77d3ac4757ec537da400e4c546ac5"} Apr 22 18:49:53.900179 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:53.900140 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" Apr 22 18:49:53.918918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:53.918876 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" podStartSLOduration=54.7098116 podStartE2EDuration="56.91886472s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="2026-04-22 18:48:57.693462534 +0000 UTC m=+134.024970978" lastFinishedPulling="2026-04-22 18:48:59.902515653 +0000 UTC m=+136.234024098" observedRunningTime="2026-04-22 18:49:53.91693664 +0000 UTC m=+190.248445091" watchObservedRunningTime="2026-04-22 18:49:53.91886472 +0000 UTC m=+190.250373172" Apr 22 18:49:53.980365 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:53.980340 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-nzr8k" Apr 22 18:49:54.159338 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.159254 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7fbkv"] Apr 22 18:49:54.159539 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.159527 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7413bf65-85db-4595-b685-fede8886d53f" containerName="registry" Apr 22 18:49:54.159578 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.159541 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7413bf65-85db-4595-b685-fede8886d53f" containerName="registry" Apr 22 18:49:54.159614 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.159603 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7413bf65-85db-4595-b685-fede8886d53f" containerName="registry" Apr 22 18:49:54.162353 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.162338 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:49:54.164933 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.164902 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-826cq\"" Apr 22 18:49:54.165054 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.164957 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:49:54.165054 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.164993 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:49:54.171643 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.171617 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7fbkv"] Apr 22 18:49:54.334159 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.334121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj256\" (UniqueName: \"kubernetes.io/projected/df2750ad-dfcb-4f8d-ad05-8e76b5f74f48-kube-api-access-pj256\") pod \"downloads-6bcc868b7-7fbkv\" (UID: \"df2750ad-dfcb-4f8d-ad05-8e76b5f74f48\") " pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:49:54.435080 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.435003 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj256\" (UniqueName: \"kubernetes.io/projected/df2750ad-dfcb-4f8d-ad05-8e76b5f74f48-kube-api-access-pj256\") pod \"downloads-6bcc868b7-7fbkv\" (UID: \"df2750ad-dfcb-4f8d-ad05-8e76b5f74f48\") " pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:49:54.444081 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.444052 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj256\" (UniqueName: 
\"kubernetes.io/projected/df2750ad-dfcb-4f8d-ad05-8e76b5f74f48-kube-api-access-pj256\") pod \"downloads-6bcc868b7-7fbkv\" (UID: \"df2750ad-dfcb-4f8d-ad05-8e76b5f74f48\") " pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:49:54.471942 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.471921 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:49:54.607808 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.607774 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7fbkv"] Apr 22 18:49:54.613284 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:49:54.613258 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2750ad_dfcb_4f8d_ad05_8e76b5f74f48.slice/crio-f5d94496c695057065f738ecfcda8d6049ec8635ed8c09493cebbbff1bc8b04b WatchSource:0}: Error finding container f5d94496c695057065f738ecfcda8d6049ec8635ed8c09493cebbbff1bc8b04b: Status 404 returned error can't find the container with id f5d94496c695057065f738ecfcda8d6049ec8635ed8c09493cebbbff1bc8b04b Apr 22 18:49:54.903347 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:49:54.903316 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7fbkv" event={"ID":"df2750ad-dfcb-4f8d-ad05-8e76b5f74f48","Type":"ContainerStarted","Data":"f5d94496c695057065f738ecfcda8d6049ec8635ed8c09493cebbbff1bc8b04b"} Apr 22 18:50:03.332718 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:03.332662 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:50:03.332718 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:03.332723 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:50:05.016656 ip-10-0-133-163 
kubenswrapper[2570]: I0422 18:50:05.016619 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c88579fc7-rvb5s"] Apr 22 18:50:05.021504 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.021478 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.024110 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.024084 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:50:05.025653 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.025344 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:50:05.025653 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.025502 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:50:05.025653 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.025540 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cnznn\"" Apr 22 18:50:05.025653 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.025569 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:50:05.025653 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.025502 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:50:05.030750 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.030543 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c88579fc7-rvb5s"] Apr 22 18:50:05.126676 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.126639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-serving-cert\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.126865 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.126776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-service-ca\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.126865 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.126811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-oauth-config\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.126865 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.126838 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-oauth-serving-cert\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.127061 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.126879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-console-config\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.127061 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:50:05.126921 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6p8f\" (UniqueName: \"kubernetes.io/projected/79b8f8c1-c817-431b-9c31-3c77631537aa-kube-api-access-z6p8f\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.228007 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.227974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-serving-cert\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.228186 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.228087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-service-ca\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.228186 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.228115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-oauth-config\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.228186 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.228132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-oauth-serving-cert\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " 
pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.228186 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.228180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-console-config\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.228371 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.228229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6p8f\" (UniqueName: \"kubernetes.io/projected/79b8f8c1-c817-431b-9c31-3c77631537aa-kube-api-access-z6p8f\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.229232 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.229202 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-oauth-serving-cert\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.229361 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.229254 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-console-config\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.229361 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.229346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-service-ca\") pod \"console-c88579fc7-rvb5s\" (UID: 
\"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.235871 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.235819 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-oauth-config\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.237553 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.237528 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-serving-cert\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.243927 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.243904 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6p8f\" (UniqueName: \"kubernetes.io/projected/79b8f8c1-c817-431b-9c31-3c77631537aa-kube-api-access-z6p8f\") pod \"console-c88579fc7-rvb5s\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:05.334412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:05.334265 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:07.441783 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:07.441754 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n4k5q_b72ffd58-b0b5-48d0-b56c-8d4e7f30307b/serve-healthcheck-canary/0.log" Apr 22 18:50:09.889728 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.889702 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c88579fc7-rvb5s"] Apr 22 18:50:09.912109 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:50:09.912082 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b8f8c1_c817_431b_9c31_3c77631537aa.slice/crio-b409abc682ac57e050765f77e37369899d8e29f718add9bc57b04aabe01a02eb WatchSource:0}: Error finding container b409abc682ac57e050765f77e37369899d8e29f718add9bc57b04aabe01a02eb: Status 404 returned error can't find the container with id b409abc682ac57e050765f77e37369899d8e29f718add9bc57b04aabe01a02eb Apr 22 18:50:09.950842 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.950812 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c88579fc7-rvb5s" event={"ID":"79b8f8c1-c817-431b-9c31-3c77631537aa","Type":"ContainerStarted","Data":"b409abc682ac57e050765f77e37369899d8e29f718add9bc57b04aabe01a02eb"} Apr 22 18:50:09.952115 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.952095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7fbkv" event={"ID":"df2750ad-dfcb-4f8d-ad05-8e76b5f74f48","Type":"ContainerStarted","Data":"482ab82f49a4d1bc63cbd655903eb4aae6f44dd2ac2c3ce18fbca00a19972cf7"} Apr 22 18:50:09.952335 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.952316 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:50:09.953828 
ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.953798 2570 patch_prober.go:28] interesting pod/downloads-6bcc868b7-7fbkv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.20:8080/\": dial tcp 10.134.0.20:8080: connect: connection refused" start-of-body= Apr 22 18:50:09.953896 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.953854 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-7fbkv" podUID="df2750ad-dfcb-4f8d-ad05-8e76b5f74f48" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.20:8080/\": dial tcp 10.134.0.20:8080: connect: connection refused" Apr 22 18:50:09.969788 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:09.969746 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7fbkv" podStartSLOduration=0.718542938 podStartE2EDuration="15.969733725s" podCreationTimestamp="2026-04-22 18:49:54 +0000 UTC" firstStartedPulling="2026-04-22 18:49:54.615126685 +0000 UTC m=+190.946635122" lastFinishedPulling="2026-04-22 18:50:09.866317464 +0000 UTC m=+206.197825909" observedRunningTime="2026-04-22 18:50:09.969087607 +0000 UTC m=+206.300596059" watchObservedRunningTime="2026-04-22 18:50:09.969733725 +0000 UTC m=+206.301242177" Apr 22 18:50:10.972203 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:10.972130 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7fbkv" Apr 22 18:50:13.078741 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.078707 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58f4998689-6vfmc"] Apr 22 18:50:13.105126 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.105095 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f4998689-6vfmc"] Apr 22 18:50:13.105286 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:50:13.105251 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.113831 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.113806 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:50:13.204722 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsnkv\" (UniqueName: \"kubernetes.io/projected/e51357d8-a933-48ab-b92a-1d4d187453d8-kube-api-access-rsnkv\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.204874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204738 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-console-config\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.204874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204766 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-serving-cert\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.204874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204786 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-service-ca\") pod \"console-58f4998689-6vfmc\" (UID: 
\"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.204874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-oauth-config\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.204874 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204858 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-oauth-serving-cert\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.205129 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.204977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-trusted-ca-bundle\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305431 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305402 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-trusted-ca-bundle\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305579 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsnkv\" 
(UniqueName: \"kubernetes.io/projected/e51357d8-a933-48ab-b92a-1d4d187453d8-kube-api-access-rsnkv\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305579 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-console-config\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305579 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305526 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-serving-cert\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305579 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-service-ca\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305786 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305593 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-oauth-config\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.305786 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.305627 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-oauth-serving-cert\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.306332 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.306306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-console-config\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.306528 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.306387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-service-ca\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.306528 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.306412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-trusted-ca-bundle\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.308324 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.308298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-serving-cert\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.308416 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.308377 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-oauth-config\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.312614 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.312583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-oauth-serving-cert\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.314417 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.314396 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsnkv\" (UniqueName: \"kubernetes.io/projected/e51357d8-a933-48ab-b92a-1d4d187453d8-kube-api-access-rsnkv\") pod \"console-58f4998689-6vfmc\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.416501 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.416461 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:13.673404 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.673323 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f4998689-6vfmc"] Apr 22 18:50:13.676797 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:50:13.676766 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51357d8_a933_48ab_b92a_1d4d187453d8.slice/crio-c561cd6e167aac201a0bb7504cd8a9e1be5d9036271413f5f3017e05cb42d38d WatchSource:0}: Error finding container c561cd6e167aac201a0bb7504cd8a9e1be5d9036271413f5f3017e05cb42d38d: Status 404 returned error can't find the container with id c561cd6e167aac201a0bb7504cd8a9e1be5d9036271413f5f3017e05cb42d38d Apr 22 18:50:13.970330 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.970250 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f4998689-6vfmc" event={"ID":"e51357d8-a933-48ab-b92a-1d4d187453d8","Type":"ContainerStarted","Data":"a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe"} Apr 22 18:50:13.970330 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.970288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f4998689-6vfmc" event={"ID":"e51357d8-a933-48ab-b92a-1d4d187453d8","Type":"ContainerStarted","Data":"c561cd6e167aac201a0bb7504cd8a9e1be5d9036271413f5f3017e05cb42d38d"} Apr 22 18:50:13.971864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.971837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c88579fc7-rvb5s" event={"ID":"79b8f8c1-c817-431b-9c31-3c77631537aa","Type":"ContainerStarted","Data":"22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10"} Apr 22 18:50:13.988776 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:13.988732 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-58f4998689-6vfmc" podStartSLOduration=0.988714422 podStartE2EDuration="988.714422ms" podCreationTimestamp="2026-04-22 18:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:13.987914074 +0000 UTC m=+210.319422527" watchObservedRunningTime="2026-04-22 18:50:13.988714422 +0000 UTC m=+210.320222876" Apr 22 18:50:14.006892 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:14.006850 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c88579fc7-rvb5s" podStartSLOduration=6.392726115 podStartE2EDuration="10.006836028s" podCreationTimestamp="2026-04-22 18:50:04 +0000 UTC" firstStartedPulling="2026-04-22 18:50:09.913993825 +0000 UTC m=+206.245502269" lastFinishedPulling="2026-04-22 18:50:13.528103737 +0000 UTC m=+209.859612182" observedRunningTime="2026-04-22 18:50:14.005774935 +0000 UTC m=+210.337283388" watchObservedRunningTime="2026-04-22 18:50:14.006836028 +0000 UTC m=+210.338344482" Apr 22 18:50:15.335058 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:15.334996 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:15.335058 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:15.335063 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:15.340530 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:15.340503 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:15.982858 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:15.982825 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:23.338433 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:23.338404 2570 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:50:23.342127 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:23.342105 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fc6d96979-n9gdr" Apr 22 18:50:23.417156 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:23.417128 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:23.417303 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:23.417164 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:23.422641 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:23.422621 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:24.017112 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:24.017084 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:50:24.068569 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:24.068538 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c88579fc7-rvb5s"] Apr 22 18:50:26.019617 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:26.019585 2570 generic.go:358] "Generic (PLEG): container finished" podID="fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3" containerID="13cc532866d9a4aa8e51bd96d6d976cf83ba55e7288d10fb3e341d31c5c7747b" exitCode=0 Apr 22 18:50:26.020077 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:26.019663 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc" 
event={"ID":"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3","Type":"ContainerDied","Data":"13cc532866d9a4aa8e51bd96d6d976cf83ba55e7288d10fb3e341d31c5c7747b"} Apr 22 18:50:26.020077 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:26.019976 2570 scope.go:117] "RemoveContainer" containerID="13cc532866d9a4aa8e51bd96d6d976cf83ba55e7288d10fb3e341d31c5c7747b" Apr 22 18:50:27.023629 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:27.023596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bd5pc" event={"ID":"fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3","Type":"ContainerStarted","Data":"20934d158ad27424a99a2c9f130d2505754d088ca9f4630c552cc0758917c128"} Apr 22 18:50:49.095367 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.095308 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c88579fc7-rvb5s" podUID="79b8f8c1-c817-431b-9c31-3c77631537aa" containerName="console" containerID="cri-o://22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10" gracePeriod=15 Apr 22 18:50:49.396732 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.396712 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c88579fc7-rvb5s_79b8f8c1-c817-431b-9c31-3c77631537aa/console/0.log" Apr 22 18:50:49.396848 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.396785 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:49.417557 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.417529 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-serving-cert\") pod \"79b8f8c1-c817-431b-9c31-3c77631537aa\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " Apr 22 18:50:49.417672 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.417563 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-oauth-config\") pod \"79b8f8c1-c817-431b-9c31-3c77631537aa\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " Apr 22 18:50:49.417672 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.417594 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6p8f\" (UniqueName: \"kubernetes.io/projected/79b8f8c1-c817-431b-9c31-3c77631537aa-kube-api-access-z6p8f\") pod \"79b8f8c1-c817-431b-9c31-3c77631537aa\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " Apr 22 18:50:49.417784 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.417715 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-oauth-serving-cert\") pod \"79b8f8c1-c817-431b-9c31-3c77631537aa\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " Apr 22 18:50:49.417784 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.417777 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-console-config\") pod \"79b8f8c1-c817-431b-9c31-3c77631537aa\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " Apr 22 
18:50:49.418179 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.417886 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-service-ca\") pod \"79b8f8c1-c817-431b-9c31-3c77631537aa\" (UID: \"79b8f8c1-c817-431b-9c31-3c77631537aa\") " Apr 22 18:50:49.418179 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.418136 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-console-config" (OuterVolumeSpecName: "console-config") pod "79b8f8c1-c817-431b-9c31-3c77631537aa" (UID: "79b8f8c1-c817-431b-9c31-3c77631537aa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:49.418359 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.418243 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-service-ca" (OuterVolumeSpecName: "service-ca") pod "79b8f8c1-c817-431b-9c31-3c77631537aa" (UID: "79b8f8c1-c817-431b-9c31-3c77631537aa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:49.418359 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.418289 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-console-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:50:49.418789 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.418761 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "79b8f8c1-c817-431b-9c31-3c77631537aa" (UID: "79b8f8c1-c817-431b-9c31-3c77631537aa"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:49.420320 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.420294 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "79b8f8c1-c817-431b-9c31-3c77631537aa" (UID: "79b8f8c1-c817-431b-9c31-3c77631537aa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:49.420412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.420333 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "79b8f8c1-c817-431b-9c31-3c77631537aa" (UID: "79b8f8c1-c817-431b-9c31-3c77631537aa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:49.420412 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.420376 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b8f8c1-c817-431b-9c31-3c77631537aa-kube-api-access-z6p8f" (OuterVolumeSpecName: "kube-api-access-z6p8f") pod "79b8f8c1-c817-431b-9c31-3c77631537aa" (UID: "79b8f8c1-c817-431b-9c31-3c77631537aa"). InnerVolumeSpecName "kube-api-access-z6p8f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:49.518864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.518837 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-serving-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:50:49.518864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.518862 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/79b8f8c1-c817-431b-9c31-3c77631537aa-console-oauth-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:50:49.518994 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.518877 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6p8f\" (UniqueName: \"kubernetes.io/projected/79b8f8c1-c817-431b-9c31-3c77631537aa-kube-api-access-z6p8f\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:50:49.518994 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.518889 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-oauth-serving-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:50:49.518994 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:49.518903 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79b8f8c1-c817-431b-9c31-3c77631537aa-service-ca\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:50:50.090962 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.090936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c88579fc7-rvb5s_79b8f8c1-c817-431b-9c31-3c77631537aa/console/0.log" Apr 22 18:50:50.091151 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.090977 2570 generic.go:358] "Generic 
(PLEG): container finished" podID="79b8f8c1-c817-431b-9c31-3c77631537aa" containerID="22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10" exitCode=2 Apr 22 18:50:50.091151 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.091039 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c88579fc7-rvb5s" event={"ID":"79b8f8c1-c817-431b-9c31-3c77631537aa","Type":"ContainerDied","Data":"22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10"} Apr 22 18:50:50.091151 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.091055 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c88579fc7-rvb5s" Apr 22 18:50:50.091151 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.091074 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c88579fc7-rvb5s" event={"ID":"79b8f8c1-c817-431b-9c31-3c77631537aa","Type":"ContainerDied","Data":"b409abc682ac57e050765f77e37369899d8e29f718add9bc57b04aabe01a02eb"} Apr 22 18:50:50.091151 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.091090 2570 scope.go:117] "RemoveContainer" containerID="22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10" Apr 22 18:50:50.099111 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.098927 2570 scope.go:117] "RemoveContainer" containerID="22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10" Apr 22 18:50:50.099349 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:50:50.099215 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10\": container with ID starting with 22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10 not found: ID does not exist" containerID="22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10" Apr 22 18:50:50.099349 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:50:50.099254 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10"} err="failed to get container status \"22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10\": rpc error: code = NotFound desc = could not find container \"22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10\": container with ID starting with 22a3d98b04aeba534dcd33dc56ffb80e7c1e871647f6c47dff02d25411212f10 not found: ID does not exist" Apr 22 18:50:50.113913 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.113894 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c88579fc7-rvb5s"] Apr 22 18:50:50.117115 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.117098 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c88579fc7-rvb5s"] Apr 22 18:50:50.248217 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:50.248192 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b8f8c1-c817-431b-9c31-3c77631537aa" path="/var/lib/kubelet/pods/79b8f8c1-c817-431b-9c31-3c77631537aa/volumes" Apr 22 18:50:55.064964 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:55.064863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:50:55.067165 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:55.067144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/383fe532-b742-451a-8a94-dc5c7fd3fce5-metrics-certs\") pod \"network-metrics-daemon-mf94f\" (UID: \"383fe532-b742-451a-8a94-dc5c7fd3fce5\") " pod="openshift-multus/network-metrics-daemon-mf94f" 
Apr 22 18:50:55.253186 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:55.253153 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qpsz\"" Apr 22 18:50:55.261376 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:55.261354 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mf94f" Apr 22 18:50:55.383894 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:55.383823 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mf94f"] Apr 22 18:50:55.386523 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:50:55.386498 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod383fe532_b742_451a_8a94_dc5c7fd3fce5.slice/crio-730024f1de9e05b9fe90a5518079c2c790852ddf4c6e6de42034a5925db22031 WatchSource:0}: Error finding container 730024f1de9e05b9fe90a5518079c2c790852ddf4c6e6de42034a5925db22031: Status 404 returned error can't find the container with id 730024f1de9e05b9fe90a5518079c2c790852ddf4c6e6de42034a5925db22031 Apr 22 18:50:56.110783 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:56.110749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mf94f" event={"ID":"383fe532-b742-451a-8a94-dc5c7fd3fce5","Type":"ContainerStarted","Data":"730024f1de9e05b9fe90a5518079c2c790852ddf4c6e6de42034a5925db22031"} Apr 22 18:50:57.115959 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:57.115928 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mf94f" event={"ID":"383fe532-b742-451a-8a94-dc5c7fd3fce5","Type":"ContainerStarted","Data":"da4d90fc5996358735dbfaaa5ec1c158372a12e929bc9271989218ae08861264"} Apr 22 18:50:57.115959 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:57.115963 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-mf94f" event={"ID":"383fe532-b742-451a-8a94-dc5c7fd3fce5","Type":"ContainerStarted","Data":"b63be03b89224eecf565f2b3cdec199360a5fbcda39507e3374a764f34bbff5a"} Apr 22 18:50:57.132875 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:57.132833 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mf94f" podStartSLOduration=251.987122225 podStartE2EDuration="4m13.132820199s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="2026-04-22 18:50:55.388687093 +0000 UTC m=+251.720195538" lastFinishedPulling="2026-04-22 18:50:56.534385082 +0000 UTC m=+252.865893512" observedRunningTime="2026-04-22 18:50:57.131423813 +0000 UTC m=+253.462932268" watchObservedRunningTime="2026-04-22 18:50:57.132820199 +0000 UTC m=+253.464328689" Apr 22 18:50:59.118809 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.118775 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:50:59.119562 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.119528 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="alertmanager" containerID="cri-o://e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4" gracePeriod=120 Apr 22 18:50:59.119675 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.119545 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-metric" containerID="cri-o://4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12" gracePeriod=120 Apr 22 18:50:59.119675 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.119578 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="prom-label-proxy" containerID="cri-o://ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70" gracePeriod=120 Apr 22 18:50:59.119675 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.119632 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="config-reloader" containerID="cri-o://5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09" gracePeriod=120 Apr 22 18:50:59.119675 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.119633 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy" containerID="cri-o://96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03" gracePeriod=120 Apr 22 18:50:59.119870 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:50:59.119592 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-web" containerID="cri-o://9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c" gracePeriod=120 Apr 22 18:51:00.129032 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.128986 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerID="ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70" exitCode=0 Apr 22 18:51:00.129032 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.129033 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerID="96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03" exitCode=0 Apr 22 18:51:00.129444 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:51:00.129042 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerID="5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09" exitCode=0 Apr 22 18:51:00.129444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.129050 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerID="e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4" exitCode=0 Apr 22 18:51:00.129444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.129045 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70"} Apr 22 18:51:00.129444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.129140 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03"} Apr 22 18:51:00.129444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.129150 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09"} Apr 22 18:51:00.129444 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.129160 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4"} Apr 22 18:51:00.287940 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.287713 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-59df7f7578-8hvcm"] Apr 22 18:51:00.288223 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.288198 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79b8f8c1-c817-431b-9c31-3c77631537aa" containerName="console" Apr 22 18:51:00.288402 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.288387 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b8f8c1-c817-431b-9c31-3c77631537aa" containerName="console" Apr 22 18:51:00.288631 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.288614 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="79b8f8c1-c817-431b-9c31-3c77631537aa" containerName="console" Apr 22 18:51:00.291967 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.291944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.299943 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.299901 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59df7f7578-8hvcm"] Apr 22 18:51:00.370279 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.370251 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:00.414944 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.414877 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-main-db\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.414944 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.414913 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-trusted-ca-bundle\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.414944 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.414934 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-out\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.414957 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.414979 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-main-tls\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" 
(UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415051 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-tls-assets\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415091 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415131 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsrvp\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-kube-api-access-nsrvp\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415171 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-metrics-client-ca\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415227 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: 
\"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415246 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-web-config\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415273 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-cluster-tls-config\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415298 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-volume\") pod \"d39edff9-1df2-4987-9004-e7da2c37c7eb\" (UID: \"d39edff9-1df2-4987-9004-e7da2c37c7eb\") " Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415335 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415395 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmcl\" (UniqueName: \"kubernetes.io/projected/42bba0ec-b46b-495c-ab02-c387ba33a97a-kube-api-access-2wmcl\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-config\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415457 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-oauth-serving-cert\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415472 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415497 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-service-ca\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415520 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-trusted-ca-bundle\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.415625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-serving-cert\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.416218 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-oauth-config\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.416218 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415764 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-main-db\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.416218 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.415782 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.416799 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.416771 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:00.419586 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.419482 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:00.419586 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.419552 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.419743 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.419668 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-volume" (OuterVolumeSpecName: "config-volume") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.419743 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.419703 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-kube-api-access-nsrvp" (OuterVolumeSpecName: "kube-api-access-nsrvp") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "kube-api-access-nsrvp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:00.420763 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.420116 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.420763 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.420577 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.420763 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.420521 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.421181 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.421155 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-out" (OuterVolumeSpecName: "config-out") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:00.427270 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.427235 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.435251 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.435226 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-web-config" (OuterVolumeSpecName: "web-config") pod "d39edff9-1df2-4987-9004-e7da2c37c7eb" (UID: "d39edff9-1df2-4987-9004-e7da2c37c7eb"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:00.516375 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-serving-cert\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-oauth-config\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmcl\" (UniqueName: \"kubernetes.io/projected/42bba0ec-b46b-495c-ab02-c387ba33a97a-kube-api-access-2wmcl\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-config\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-oauth-serving-cert\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-service-ca\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516546 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-trusted-ca-bundle\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516594 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516611 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-web-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516627 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-cluster-tls-config\") on node \"ip-10-0-133-163.ec2.internal\" 
DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516641 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-volume\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516656 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d39edff9-1df2-4987-9004-e7da2c37c7eb-config-out\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516670 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516684 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-main-tls\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516697 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-tls-assets\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516711 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d39edff9-1df2-4987-9004-e7da2c37c7eb-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 
18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516729 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsrvp\" (UniqueName: \"kubernetes.io/projected/d39edff9-1df2-4987-9004-e7da2c37c7eb-kube-api-access-nsrvp\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.516869 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.516744 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d39edff9-1df2-4987-9004-e7da2c37c7eb-metrics-client-ca\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:00.517341 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.517318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-oauth-serving-cert\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.517381 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.517318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-config\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.517556 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.517532 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-trusted-ca-bundle\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.517683 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.517666 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-service-ca\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.518655 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.518633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-serving-cert\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.518843 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.518822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-oauth-config\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.525824 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.525804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmcl\" (UniqueName: \"kubernetes.io/projected/42bba0ec-b46b-495c-ab02-c387ba33a97a-kube-api-access-2wmcl\") pod \"console-59df7f7578-8hvcm\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.605203 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.605178 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:00.729715 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:00.729691 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59df7f7578-8hvcm"] Apr 22 18:51:00.731913 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:51:00.731874 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42bba0ec_b46b_495c_ab02_c387ba33a97a.slice/crio-6a1f4a8a025d1b11e1a308b14e3a06bb587cd68e256b6b95321d8cf0b267cf0a WatchSource:0}: Error finding container 6a1f4a8a025d1b11e1a308b14e3a06bb587cd68e256b6b95321d8cf0b267cf0a: Status 404 returned error can't find the container with id 6a1f4a8a025d1b11e1a308b14e3a06bb587cd68e256b6b95321d8cf0b267cf0a Apr 22 18:51:01.133613 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.133583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59df7f7578-8hvcm" event={"ID":"42bba0ec-b46b-495c-ab02-c387ba33a97a","Type":"ContainerStarted","Data":"af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690"} Apr 22 18:51:01.134079 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.133623 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59df7f7578-8hvcm" event={"ID":"42bba0ec-b46b-495c-ab02-c387ba33a97a","Type":"ContainerStarted","Data":"6a1f4a8a025d1b11e1a308b14e3a06bb587cd68e256b6b95321d8cf0b267cf0a"} Apr 22 18:51:01.136344 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136320 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerID="4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12" exitCode=0 Apr 22 18:51:01.136344 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136343 2570 generic.go:358] "Generic (PLEG): container finished" podID="d39edff9-1df2-4987-9004-e7da2c37c7eb" 
containerID="9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c" exitCode=0 Apr 22 18:51:01.136484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136404 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12"} Apr 22 18:51:01.136484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136426 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.136484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136437 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c"} Apr 22 18:51:01.136484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136449 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d39edff9-1df2-4987-9004-e7da2c37c7eb","Type":"ContainerDied","Data":"dbd5ae042abf4a51ff6a8c0d8312817be2f3180dbd5db5f6951147ff74b09a79"} Apr 22 18:51:01.136484 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.136464 2570 scope.go:117] "RemoveContainer" containerID="ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70" Apr 22 18:51:01.143680 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.143665 2570 scope.go:117] "RemoveContainer" containerID="4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12" Apr 22 18:51:01.150097 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.150079 2570 scope.go:117] "RemoveContainer" containerID="96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03" Apr 22 18:51:01.155483 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.155435 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59df7f7578-8hvcm" podStartSLOduration=1.155420498 podStartE2EDuration="1.155420498s" podCreationTimestamp="2026-04-22 18:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:01.150953266 +0000 UTC m=+257.482461720" watchObservedRunningTime="2026-04-22 18:51:01.155420498 +0000 UTC m=+257.486928952" Apr 22 18:51:01.161868 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.161850 2570 scope.go:117] "RemoveContainer" containerID="9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c" Apr 22 18:51:01.168407 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.168390 2570 scope.go:117] "RemoveContainer" containerID="5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09" Apr 22 18:51:01.172496 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.172476 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:51:01.175279 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.175263 2570 scope.go:117] "RemoveContainer" containerID="e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4" Apr 22 18:51:01.179500 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.179479 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:51:01.181829 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.181814 2570 scope.go:117] "RemoveContainer" containerID="be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e" Apr 22 18:51:01.188135 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188112 2570 scope.go:117] "RemoveContainer" containerID="ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70" Apr 22 18:51:01.188357 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.188338 2570 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70\": container with ID starting with ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70 not found: ID does not exist" containerID="ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70" Apr 22 18:51:01.188403 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188364 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70"} err="failed to get container status \"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70\": rpc error: code = NotFound desc = could not find container \"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70\": container with ID starting with ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70 not found: ID does not exist" Apr 22 18:51:01.188403 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188379 2570 scope.go:117] "RemoveContainer" containerID="4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12" Apr 22 18:51:01.188581 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.188567 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12\": container with ID starting with 4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12 not found: ID does not exist" containerID="4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12" Apr 22 18:51:01.188625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188584 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12"} err="failed to get container status \"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12\": 
rpc error: code = NotFound desc = could not find container \"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12\": container with ID starting with 4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12 not found: ID does not exist" Apr 22 18:51:01.188625 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188596 2570 scope.go:117] "RemoveContainer" containerID="96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03" Apr 22 18:51:01.188783 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.188769 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03\": container with ID starting with 96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03 not found: ID does not exist" containerID="96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03" Apr 22 18:51:01.188827 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188786 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03"} err="failed to get container status \"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03\": rpc error: code = NotFound desc = could not find container \"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03\": container with ID starting with 96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03 not found: ID does not exist" Apr 22 18:51:01.188827 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188797 2570 scope.go:117] "RemoveContainer" containerID="9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c" Apr 22 18:51:01.188968 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.188954 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c\": container with ID starting with 9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c not found: ID does not exist" containerID="9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c" Apr 22 18:51:01.189022 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188972 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c"} err="failed to get container status \"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c\": rpc error: code = NotFound desc = could not find container \"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c\": container with ID starting with 9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c not found: ID does not exist" Apr 22 18:51:01.189022 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.188983 2570 scope.go:117] "RemoveContainer" containerID="5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09" Apr 22 18:51:01.189235 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.189218 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09\": container with ID starting with 5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09 not found: ID does not exist" containerID="5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09" Apr 22 18:51:01.189294 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189239 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09"} err="failed to get container status \"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09\": rpc error: code = NotFound desc = could not find container 
\"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09\": container with ID starting with 5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09 not found: ID does not exist" Apr 22 18:51:01.189294 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189251 2570 scope.go:117] "RemoveContainer" containerID="e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4" Apr 22 18:51:01.189444 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.189430 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4\": container with ID starting with e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4 not found: ID does not exist" containerID="e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4" Apr 22 18:51:01.189482 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189447 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4"} err="failed to get container status \"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4\": rpc error: code = NotFound desc = could not find container \"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4\": container with ID starting with e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4 not found: ID does not exist" Apr 22 18:51:01.189482 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189458 2570 scope.go:117] "RemoveContainer" containerID="be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e" Apr 22 18:51:01.189674 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:01.189657 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e\": container with ID starting with 
be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e not found: ID does not exist" containerID="be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e" Apr 22 18:51:01.189715 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189679 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e"} err="failed to get container status \"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e\": rpc error: code = NotFound desc = could not find container \"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e\": container with ID starting with be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e not found: ID does not exist" Apr 22 18:51:01.189715 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189695 2570 scope.go:117] "RemoveContainer" containerID="ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70" Apr 22 18:51:01.189930 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189913 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70"} err="failed to get container status \"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70\": rpc error: code = NotFound desc = could not find container \"ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70\": container with ID starting with ae1d89c896e17669defd74caeebe3b78126036cf9e3dea49a8644f0097ab1d70 not found: ID does not exist" Apr 22 18:51:01.189974 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.189930 2570 scope.go:117] "RemoveContainer" containerID="4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12" Apr 22 18:51:01.190189 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190173 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12"} err="failed to get container status \"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12\": rpc error: code = NotFound desc = could not find container \"4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12\": container with ID starting with 4602d3d7f3e990e3524f4bcc82986e18058b7e0af624e4ef66f4bb28922e2c12 not found: ID does not exist" Apr 22 18:51:01.190245 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190190 2570 scope.go:117] "RemoveContainer" containerID="96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03" Apr 22 18:51:01.190429 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190411 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03"} err="failed to get container status \"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03\": rpc error: code = NotFound desc = could not find container \"96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03\": container with ID starting with 96e8d426c91e96a9680e5ef3c366885fad814eab69aa998b7a14356e0e917d03 not found: ID does not exist" Apr 22 18:51:01.190481 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190431 2570 scope.go:117] "RemoveContainer" containerID="9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c" Apr 22 18:51:01.190620 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190602 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c"} err="failed to get container status \"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c\": rpc error: code = NotFound desc = could not find container \"9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c\": container with ID starting with 
9da50cd00cf7f6c1f99c5b7c8717d90eabdca9171033feee6b108cc345a1fd6c not found: ID does not exist" Apr 22 18:51:01.190655 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190622 2570 scope.go:117] "RemoveContainer" containerID="5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09" Apr 22 18:51:01.190827 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190811 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09"} err="failed to get container status \"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09\": rpc error: code = NotFound desc = could not find container \"5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09\": container with ID starting with 5d7d7cd1734edeb268ebfbbb6388b34f9af2398d291b996a8e708542eb469a09 not found: ID does not exist" Apr 22 18:51:01.190880 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.190827 2570 scope.go:117] "RemoveContainer" containerID="e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4" Apr 22 18:51:01.191039 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.191004 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4"} err="failed to get container status \"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4\": rpc error: code = NotFound desc = could not find container \"e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4\": container with ID starting with e9c26df027971f12324e17d84f242752323ceffd102f307011f176c1182f10f4 not found: ID does not exist" Apr 22 18:51:01.191089 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.191040 2570 scope.go:117] "RemoveContainer" containerID="be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e" Apr 22 18:51:01.191254 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.191232 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e"} err="failed to get container status \"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e\": rpc error: code = NotFound desc = could not find container \"be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e\": container with ID starting with be98b82920ad20f5a775fd478074b2783c5df083017ecd583506248846b1ee7e not found: ID does not exist" Apr 22 18:51:01.205800 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.205779 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206209 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-metric" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206230 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-metric" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206245 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="init-config-reloader" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206254 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="init-config-reloader" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206266 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="prom-label-proxy" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206274 2570 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="prom-label-proxy" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206286 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-web" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206294 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-web" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206312 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="config-reloader" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206320 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="config-reloader" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206331 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="alertmanager" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206339 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="alertmanager" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206348 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206356 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206423 2570 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206434 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-metric" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206443 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="prom-label-proxy" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206453 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="alertmanager" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206462 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="kube-rbac-proxy-web" Apr 22 18:51:01.206489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.206472 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" containerName="config-reloader" Apr 22 18:51:01.213877 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.212154 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.215316 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.215562 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.216341 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.216948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.217178 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.217447 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.217614 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:51:01.219996 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.218119 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:51:01.220431 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.220082 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dnr65\"" Apr 22 18:51:01.232104 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.227411 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:51:01.233232 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.233212 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:51:01.328667 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7m4b\" (UniqueName: \"kubernetes.io/projected/6ba36426-4392-43bf-b52e-099fcad1b911-kube-api-access-w7m4b\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ba36426-4392-43bf-b52e-099fcad1b911-config-out\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/6ba36426-4392-43bf-b52e-099fcad1b911-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328949 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328826 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-config-volume\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328949 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328949 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ba36426-4392-43bf-b52e-099fcad1b911-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.328949 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-web-config\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.329150 ip-10-0-133-163 
kubenswrapper[2570]: I0422 18:51:01.328942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba36426-4392-43bf-b52e-099fcad1b911-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.329150 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.328979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.329150 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.329002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.329150 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.329058 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.329150 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.329096 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6ba36426-4392-43bf-b52e-099fcad1b911-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430148 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba36426-4392-43bf-b52e-099fcad1b911-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430148 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430332 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430158 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430332 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430332 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:51:01.430210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ba36426-4392-43bf-b52e-099fcad1b911-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430332 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7m4b\" (UniqueName: \"kubernetes.io/projected/6ba36426-4392-43bf-b52e-099fcad1b911-kube-api-access-w7m4b\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430332 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ba36426-4392-43bf-b52e-099fcad1b911-config-out\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.430705 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6ba36426-4392-43bf-b52e-099fcad1b911-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
18:51:01.430915 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-config-volume\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.431069 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.431051 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.431176 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.431158 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ba36426-4392-43bf-b52e-099fcad1b911-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.431235 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.431198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-web-config\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433301 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ba36426-4392-43bf-b52e-099fcad1b911-config-out\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433390 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.431062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba36426-4392-43bf-b52e-099fcad1b911-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433456 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433393 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433456 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433456 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.430939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6ba36426-4392-43bf-b52e-099fcad1b911-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433589 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6ba36426-4392-43bf-b52e-099fcad1b911-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433589 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433574 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-config-volume\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433920 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ba36426-4392-43bf-b52e-099fcad1b911-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433998 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433924 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.433998 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.433961 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-web-config\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.434241 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.434222 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.435526 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.435507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ba36426-4392-43bf-b52e-099fcad1b911-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.438851 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.438832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7m4b\" (UniqueName: \"kubernetes.io/projected/6ba36426-4392-43bf-b52e-099fcad1b911-kube-api-access-w7m4b\") pod \"alertmanager-main-0\" (UID: \"6ba36426-4392-43bf-b52e-099fcad1b911\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.538665 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.538642 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:51:01.662956 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:01.662695 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:51:01.665037 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:51:01.664996 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba36426_4392_43bf_b52e_099fcad1b911.slice/crio-d0f07b2e72f65b3566c97aca7175eecc7dc38ec58e572b90d8f493f2833325df WatchSource:0}: Error finding container d0f07b2e72f65b3566c97aca7175eecc7dc38ec58e572b90d8f493f2833325df: Status 404 returned error can't find the container with id d0f07b2e72f65b3566c97aca7175eecc7dc38ec58e572b90d8f493f2833325df Apr 22 18:51:02.140783 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:02.140752 2570 generic.go:358] "Generic (PLEG): container finished" podID="6ba36426-4392-43bf-b52e-099fcad1b911" containerID="ddad2ed072e8a99210de69524d797a53e520a1a33e06f406be5b20bbf3dedfc3" exitCode=0 Apr 22 18:51:02.141150 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:02.140834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerDied","Data":"ddad2ed072e8a99210de69524d797a53e520a1a33e06f406be5b20bbf3dedfc3"} Apr 22 18:51:02.141150 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:02.140866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"d0f07b2e72f65b3566c97aca7175eecc7dc38ec58e572b90d8f493f2833325df"} Apr 22 18:51:02.248217 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:02.248184 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39edff9-1df2-4987-9004-e7da2c37c7eb" 
path="/var/lib/kubelet/pods/d39edff9-1df2-4987-9004-e7da2c37c7eb/volumes" Apr 22 18:51:03.148025 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.147973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"f63ef86c5426aaf564796eb39d59946d45de70ad462c99e3338b85a428be88e2"} Apr 22 18:51:03.148025 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.148010 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"ceff437560ad8cce66261e4d7bea4507c3e851bbbb0e4a0ed6dec85c8d4de266"} Apr 22 18:51:03.148025 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.148035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"21eba369b824a73cb40d6b660b1510c4c1ed33e7a5c7c1a651cabd1b3ccf1399"} Apr 22 18:51:03.148600 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.148044 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"073fac4b790d56c21a0b65f7628e9af2394e7e66c67e59e65e1a8b466dbbc09e"} Apr 22 18:51:03.148600 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.148054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"9ac3433845bc202898d70a4e90b274ac9ecbd12cf67316e6d11c16471a02e1f4"} Apr 22 18:51:03.148600 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.148062 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6ba36426-4392-43bf-b52e-099fcad1b911","Type":"ContainerStarted","Data":"a25b13f599380fb23fac9d526133736ee85c5684ccad3082d26b997236285b62"} Apr 22 18:51:03.182058 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:03.181989 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.181973378 podStartE2EDuration="2.181973378s" podCreationTimestamp="2026-04-22 18:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:03.179708303 +0000 UTC m=+259.511216791" watchObservedRunningTime="2026-04-22 18:51:03.181973378 +0000 UTC m=+259.513481830" Apr 22 18:51:10.606280 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:10.606241 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:10.606280 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:10.606278 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:10.611061 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:10.611035 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:11.176818 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:11.176790 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:51:11.221317 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:11.221290 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58f4998689-6vfmc"] Apr 22 18:51:36.242603 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.242535 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58f4998689-6vfmc" 
podUID="e51357d8-a933-48ab-b92a-1d4d187453d8" containerName="console" containerID="cri-o://a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe" gracePeriod=15 Apr 22 18:51:36.474359 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.474337 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58f4998689-6vfmc_e51357d8-a933-48ab-b92a-1d4d187453d8/console/0.log" Apr 22 18:51:36.474465 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.474398 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:51:36.508489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508416 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-oauth-serving-cert\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508455 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-oauth-config\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508489 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508481 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-serving-cert\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508747 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508520 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-service-ca\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508747 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508549 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsnkv\" (UniqueName: \"kubernetes.io/projected/e51357d8-a933-48ab-b92a-1d4d187453d8-kube-api-access-rsnkv\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508747 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508684 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-trusted-ca-bundle\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508908 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508748 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-console-config\") pod \"e51357d8-a933-48ab-b92a-1d4d187453d8\" (UID: \"e51357d8-a933-48ab-b92a-1d4d187453d8\") " Apr 22 18:51:36.508972 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508913 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:36.509060 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.508986 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-service-ca" (OuterVolumeSpecName: "service-ca") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:36.509124 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.509083 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-oauth-serving-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:36.509265 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.509236 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-console-config" (OuterVolumeSpecName: "console-config") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:36.509385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.509276 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:36.510725 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.510694 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:36.511178 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.511148 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:36.511178 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.511154 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51357d8-a933-48ab-b92a-1d4d187453d8-kube-api-access-rsnkv" (OuterVolumeSpecName: "kube-api-access-rsnkv") pod "e51357d8-a933-48ab-b92a-1d4d187453d8" (UID: "e51357d8-a933-48ab-b92a-1d4d187453d8"). InnerVolumeSpecName "kube-api-access-rsnkv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:36.609920 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.609881 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-trusted-ca-bundle\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:36.609920 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.609913 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-console-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:36.609920 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.609923 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-oauth-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:36.610177 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.609932 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e51357d8-a933-48ab-b92a-1d4d187453d8-console-serving-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:36.610177 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.609944 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e51357d8-a933-48ab-b92a-1d4d187453d8-service-ca\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:36.610177 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:36.609954 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rsnkv\" (UniqueName: \"kubernetes.io/projected/e51357d8-a933-48ab-b92a-1d4d187453d8-kube-api-access-rsnkv\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:51:37.251264 ip-10-0-133-163 
kubenswrapper[2570]: I0422 18:51:37.251236 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58f4998689-6vfmc_e51357d8-a933-48ab-b92a-1d4d187453d8/console/0.log" Apr 22 18:51:37.251692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.251276 2570 generic.go:358] "Generic (PLEG): container finished" podID="e51357d8-a933-48ab-b92a-1d4d187453d8" containerID="a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe" exitCode=2 Apr 22 18:51:37.251692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.251338 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f4998689-6vfmc" Apr 22 18:51:37.251692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.251371 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f4998689-6vfmc" event={"ID":"e51357d8-a933-48ab-b92a-1d4d187453d8","Type":"ContainerDied","Data":"a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe"} Apr 22 18:51:37.251692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.251418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f4998689-6vfmc" event={"ID":"e51357d8-a933-48ab-b92a-1d4d187453d8","Type":"ContainerDied","Data":"c561cd6e167aac201a0bb7504cd8a9e1be5d9036271413f5f3017e05cb42d38d"} Apr 22 18:51:37.251692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.251439 2570 scope.go:117] "RemoveContainer" containerID="a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe" Apr 22 18:51:37.260153 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.260136 2570 scope.go:117] "RemoveContainer" containerID="a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe" Apr 22 18:51:37.260419 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:51:37.260400 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe\": container with ID starting with a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe not found: ID does not exist" containerID="a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe" Apr 22 18:51:37.260468 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.260428 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe"} err="failed to get container status \"a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe\": rpc error: code = NotFound desc = could not find container \"a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe\": container with ID starting with a88d28dc05d9af7da8279f09a1d610d7c60af92d120f22fb1192e0ee7b2baefe not found: ID does not exist" Apr 22 18:51:37.273909 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.273885 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58f4998689-6vfmc"] Apr 22 18:51:37.277288 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.277266 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58f4998689-6vfmc"] Apr 22 18:51:37.722385 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.722347 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2l626"] Apr 22 18:51:37.722643 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.722632 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e51357d8-a933-48ab-b92a-1d4d187453d8" containerName="console" Apr 22 18:51:37.722692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.722645 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51357d8-a933-48ab-b92a-1d4d187453d8" containerName="console" Apr 22 18:51:37.722727 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.722722 2570 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="e51357d8-a933-48ab-b92a-1d4d187453d8" containerName="console" Apr 22 18:51:37.727026 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.726995 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.729275 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.729258 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:51:37.733109 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.733091 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2l626"] Apr 22 18:51:37.819105 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.819067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/758c6e79-f76a-46e0-b353-55929fcd68c4-kubelet-config\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.819272 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.819181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/758c6e79-f76a-46e0-b353-55929fcd68c4-dbus\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.819272 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.819215 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/758c6e79-f76a-46e0-b353-55929fcd68c4-original-pull-secret\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 
22 18:51:37.919896 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.919864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/758c6e79-f76a-46e0-b353-55929fcd68c4-dbus\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.920099 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.919904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/758c6e79-f76a-46e0-b353-55929fcd68c4-original-pull-secret\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.920099 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.919959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/758c6e79-f76a-46e0-b353-55929fcd68c4-kubelet-config\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.920099 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.920067 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/758c6e79-f76a-46e0-b353-55929fcd68c4-dbus\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.920099 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.920085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/758c6e79-f76a-46e0-b353-55929fcd68c4-kubelet-config\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " 
pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:37.922179 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:37.922162 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/758c6e79-f76a-46e0-b353-55929fcd68c4-original-pull-secret\") pod \"global-pull-secret-syncer-2l626\" (UID: \"758c6e79-f76a-46e0-b353-55929fcd68c4\") " pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:38.037139 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:38.037052 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2l626" Apr 22 18:51:38.153278 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:38.153245 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2l626"] Apr 22 18:51:38.156146 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:51:38.156110 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758c6e79_f76a_46e0_b353_55929fcd68c4.slice/crio-ea11e21a3bb2f8ab20d5b3d6eefefafe696e1bcdcca751e457e882428cc420b8 WatchSource:0}: Error finding container ea11e21a3bb2f8ab20d5b3d6eefefafe696e1bcdcca751e457e882428cc420b8: Status 404 returned error can't find the container with id ea11e21a3bb2f8ab20d5b3d6eefefafe696e1bcdcca751e457e882428cc420b8 Apr 22 18:51:38.247919 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:38.247884 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51357d8-a933-48ab-b92a-1d4d187453d8" path="/var/lib/kubelet/pods/e51357d8-a933-48ab-b92a-1d4d187453d8/volumes" Apr 22 18:51:38.254972 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:38.254937 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2l626" 
event={"ID":"758c6e79-f76a-46e0-b353-55929fcd68c4","Type":"ContainerStarted","Data":"ea11e21a3bb2f8ab20d5b3d6eefefafe696e1bcdcca751e457e882428cc420b8"} Apr 22 18:51:42.271435 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:42.271332 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2l626" event={"ID":"758c6e79-f76a-46e0-b353-55929fcd68c4","Type":"ContainerStarted","Data":"1c13de7d53778387236004188b0c50f85b18636c3cfc0ce60db519dcea05a46a"} Apr 22 18:51:42.287513 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:42.287466 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2l626" podStartSLOduration=1.43669267 podStartE2EDuration="5.287451706s" podCreationTimestamp="2026-04-22 18:51:37 +0000 UTC" firstStartedPulling="2026-04-22 18:51:38.15764657 +0000 UTC m=+294.489155000" lastFinishedPulling="2026-04-22 18:51:42.008405592 +0000 UTC m=+298.339914036" observedRunningTime="2026-04-22 18:51:42.286352833 +0000 UTC m=+298.617861287" watchObservedRunningTime="2026-04-22 18:51:42.287451706 +0000 UTC m=+298.618960157" Apr 22 18:51:44.118581 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:44.118552 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:51:44.120370 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:44.120346 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:51:44.124920 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:44.124703 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 18:51:44.127265 ip-10-0-133-163 kubenswrapper[2570]: I0422 
18:51:44.127245 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 18:51:44.128817 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:44.128795 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:51:56.542982 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.542947 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"] Apr 22 18:51:56.546330 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.546308 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj" Apr 22 18:51:56.548750 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.548727 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:51:56.548858 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.548727 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-h26j4\"" Apr 22 18:51:56.549849 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.549831 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:51:56.553967 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.553944 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"] Apr 22 18:51:56.671380 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.671341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xnv\" (UniqueName: \"kubernetes.io/projected/2086c9c4-a688-4bc7-b802-54b5e27d40e4-kube-api-access-k4xnv\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.671583 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.671410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.671583 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.671442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.772483 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.772440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xnv\" (UniqueName: \"kubernetes.io/projected/2086c9c4-a688-4bc7-b802-54b5e27d40e4-kube-api-access-k4xnv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.772659 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.772503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.772659 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.772533 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.772890 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.772867 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.772968 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.772949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.781583 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.781557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xnv\" (UniqueName: \"kubernetes.io/projected/2086c9c4-a688-4bc7-b802-54b5e27d40e4-kube-api-access-k4xnv\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.856083 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.856051 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:51:56.975948 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.975925 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"]
Apr 22 18:51:56.978427 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:51:56.978396 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2086c9c4_a688_4bc7_b802_54b5e27d40e4.slice/crio-7501306fd833f8c4ee4f0e07f25be87a2ee87a33c726c4faf0f32c72aff78081 WatchSource:0}: Error finding container 7501306fd833f8c4ee4f0e07f25be87a2ee87a33c726c4faf0f32c72aff78081: Status 404 returned error can't find the container with id 7501306fd833f8c4ee4f0e07f25be87a2ee87a33c726c4faf0f32c72aff78081
Apr 22 18:51:56.980250 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:56.980233 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:51:57.315273 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:51:57.315187 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj" event={"ID":"2086c9c4-a688-4bc7-b802-54b5e27d40e4","Type":"ContainerStarted","Data":"7501306fd833f8c4ee4f0e07f25be87a2ee87a33c726c4faf0f32c72aff78081"}
Apr 22 18:52:02.332583 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:02.332547 2570 generic.go:358] "Generic (PLEG): container finished" podID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerID="f37827e3a58c77f53328ec782a4d0032cc5108ea0ef84f6c3695515c3cb88f62" exitCode=0
Apr 22 18:52:02.333077 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:02.332597 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj" event={"ID":"2086c9c4-a688-4bc7-b802-54b5e27d40e4","Type":"ContainerDied","Data":"f37827e3a58c77f53328ec782a4d0032cc5108ea0ef84f6c3695515c3cb88f62"}
Apr 22 18:52:05.343502 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:05.343419 2570 generic.go:358] "Generic (PLEG): container finished" podID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerID="1e8e5b69309e1384215330e4883c69df49bc308479fe7fd6df05ff7441c3ad7a" exitCode=0
Apr 22 18:52:05.343502 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:05.343483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj" event={"ID":"2086c9c4-a688-4bc7-b802-54b5e27d40e4","Type":"ContainerDied","Data":"1e8e5b69309e1384215330e4883c69df49bc308479fe7fd6df05ff7441c3ad7a"}
Apr 22 18:52:13.370079 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:13.370043 2570 generic.go:358] "Generic (PLEG): container finished" podID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerID="88b115059ed4b1580696cc41360ea2cb35baeb82e2b1a2bfce2823a063e0b1d3" exitCode=0
Apr 22 18:52:13.370469 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:13.370125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj" event={"ID":"2086c9c4-a688-4bc7-b802-54b5e27d40e4","Type":"ContainerDied","Data":"88b115059ed4b1580696cc41360ea2cb35baeb82e2b1a2bfce2823a063e0b1d3"}
Apr 22 18:52:14.491485 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.491457 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:52:14.631880 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.631782 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xnv\" (UniqueName: \"kubernetes.io/projected/2086c9c4-a688-4bc7-b802-54b5e27d40e4-kube-api-access-k4xnv\") pod \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") "
Apr 22 18:52:14.631880 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.631870 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-bundle\") pod \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") "
Apr 22 18:52:14.632154 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.631893 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-util\") pod \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\" (UID: \"2086c9c4-a688-4bc7-b802-54b5e27d40e4\") "
Apr 22 18:52:14.632506 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.632483 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-bundle" (OuterVolumeSpecName: "bundle") pod "2086c9c4-a688-4bc7-b802-54b5e27d40e4" (UID: "2086c9c4-a688-4bc7-b802-54b5e27d40e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:14.634046 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.634028 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2086c9c4-a688-4bc7-b802-54b5e27d40e4-kube-api-access-k4xnv" (OuterVolumeSpecName: "kube-api-access-k4xnv") pod "2086c9c4-a688-4bc7-b802-54b5e27d40e4" (UID: "2086c9c4-a688-4bc7-b802-54b5e27d40e4"). InnerVolumeSpecName "kube-api-access-k4xnv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:14.635829 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.635804 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-util" (OuterVolumeSpecName: "util") pod "2086c9c4-a688-4bc7-b802-54b5e27d40e4" (UID: "2086c9c4-a688-4bc7-b802-54b5e27d40e4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:14.732646 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.732590 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4xnv\" (UniqueName: \"kubernetes.io/projected/2086c9c4-a688-4bc7-b802-54b5e27d40e4-kube-api-access-k4xnv\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\""
Apr 22 18:52:14.732646 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.732638 2570 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-bundle\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\""
Apr 22 18:52:14.732646 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:14.732655 2570 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2086c9c4-a688-4bc7-b802-54b5e27d40e4-util\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\""
Apr 22 18:52:15.377399 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:15.377372 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj"
Apr 22 18:52:15.377399 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:15.377377 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cs67bj" event={"ID":"2086c9c4-a688-4bc7-b802-54b5e27d40e4","Type":"ContainerDied","Data":"7501306fd833f8c4ee4f0e07f25be87a2ee87a33c726c4faf0f32c72aff78081"}
Apr 22 18:52:15.377647 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:15.377414 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7501306fd833f8c4ee4f0e07f25be87a2ee87a33c726c4faf0f32c72aff78081"
Apr 22 18:52:18.193754 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.193716 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"]
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194078 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="extract"
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194094 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="extract"
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194108 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="pull"
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194113 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="pull"
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194125 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="util"
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194132 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="util"
Apr 22 18:52:18.194198 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.194197 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2086c9c4-a688-4bc7-b802-54b5e27d40e4" containerName="extract"
Apr 22 18:52:18.198357 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.198341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.200738 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.200690 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 18:52:18.200864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.200800 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lgqtx\""
Apr 22 18:52:18.200864 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.200816 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 18:52:18.200979 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.200910 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 18:52:18.206192 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.206170 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"]
Apr 22 18:52:18.361987 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.361944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbjkk\" (UniqueName: \"kubernetes.io/projected/b9d2d385-d381-4d6c-a339-4e1d111cc414-kube-api-access-nbjkk\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6\" (UID: \"b9d2d385-d381-4d6c-a339-4e1d111cc414\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.362253 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.362183 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b9d2d385-d381-4d6c-a339-4e1d111cc414-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6\" (UID: \"b9d2d385-d381-4d6c-a339-4e1d111cc414\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.463320 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.463238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b9d2d385-d381-4d6c-a339-4e1d111cc414-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6\" (UID: \"b9d2d385-d381-4d6c-a339-4e1d111cc414\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.463469 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.463326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbjkk\" (UniqueName: \"kubernetes.io/projected/b9d2d385-d381-4d6c-a339-4e1d111cc414-kube-api-access-nbjkk\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6\" (UID: \"b9d2d385-d381-4d6c-a339-4e1d111cc414\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.465924 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.465897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b9d2d385-d381-4d6c-a339-4e1d111cc414-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6\" (UID: \"b9d2d385-d381-4d6c-a339-4e1d111cc414\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.472186 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.472156 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbjkk\" (UniqueName: \"kubernetes.io/projected/b9d2d385-d381-4d6c-a339-4e1d111cc414-kube-api-access-nbjkk\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6\" (UID: \"b9d2d385-d381-4d6c-a339-4e1d111cc414\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.509036 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.508990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:18.631999 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:18.631971 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"]
Apr 22 18:52:18.634337 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:52:18.634302 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9d2d385_d381_4d6c_a339_4e1d111cc414.slice/crio-a2f4e53e702e377a11372660ee9230ff8b326a1c24e7b6d1cc4ae6249acc7956 WatchSource:0}: Error finding container a2f4e53e702e377a11372660ee9230ff8b326a1c24e7b6d1cc4ae6249acc7956: Status 404 returned error can't find the container with id a2f4e53e702e377a11372660ee9230ff8b326a1c24e7b6d1cc4ae6249acc7956
Apr 22 18:52:19.396589 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:19.396549 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6" event={"ID":"b9d2d385-d381-4d6c-a339-4e1d111cc414","Type":"ContainerStarted","Data":"a2f4e53e702e377a11372660ee9230ff8b326a1c24e7b6d1cc4ae6249acc7956"}
Apr 22 18:52:22.408536 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.408502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6" event={"ID":"b9d2d385-d381-4d6c-a339-4e1d111cc414","Type":"ContainerStarted","Data":"df7098ef588b06c1ada50305cd1897ab70462f7d5729d258e660a01dee17291c"}
Apr 22 18:52:22.408944 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.408617 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6"
Apr 22 18:52:22.429258 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.429207 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6" podStartSLOduration=0.86784731 podStartE2EDuration="4.429191179s" podCreationTimestamp="2026-04-22 18:52:18 +0000 UTC" firstStartedPulling="2026-04-22 18:52:18.635973864 +0000 UTC m=+334.967482295" lastFinishedPulling="2026-04-22 18:52:22.197317734 +0000 UTC m=+338.528826164" observedRunningTime="2026-04-22 18:52:22.427294362 +0000 UTC m=+338.758802807" watchObservedRunningTime="2026-04-22 18:52:22.429191179 +0000 UTC m=+338.760699631"
Apr 22 18:52:22.692714 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.692681 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-m4njj"]
Apr 22 18:52:22.695919 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.695902 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.698137 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.698116 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-w8zbk\""
Apr 22 18:52:22.698339 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.698322 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 18:52:22.698422 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.698338 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 18:52:22.702912 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.702892 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-m4njj"]
Apr 22 18:52:22.803730 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.803695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.803918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.803770 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hp57\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-kube-api-access-5hp57\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.803918 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.803821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e5f2627e-c328-4aa3-a3db-a86495d60aaa-cabundle0\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.904772 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.904737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.904969 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.904792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hp57\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-kube-api-access-5hp57\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.904969 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.904822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e5f2627e-c328-4aa3-a3db-a86495d60aaa-cabundle0\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.904969 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:22.904909 2570 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 22 18:52:22.904969 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:22.904937 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:52:22.904969 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:22.904948 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:52:22.904969 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:22.904964 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m4njj: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 18:52:22.905329 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:22.905037 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates podName:e5f2627e-c328-4aa3-a3db-a86495d60aaa nodeName:}" failed. No retries permitted until 2026-04-22 18:52:23.405003229 +0000 UTC m=+339.736511666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates") pod "keda-operator-ffbb595cb-m4njj" (UID: "e5f2627e-c328-4aa3-a3db-a86495d60aaa") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 18:52:22.905504 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.905485 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e5f2627e-c328-4aa3-a3db-a86495d60aaa-cabundle0\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:22.917672 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:22.917645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hp57\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-kube-api-access-5hp57\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:23.038294 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.038218 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"]
Apr 22 18:52:23.041895 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.041864 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.044737 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.044715 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 18:52:23.051221 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.051200 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"]
Apr 22 18:52:23.208086 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.208045 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqmg\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-kube-api-access-5kqmg\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.208259 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.208105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/21334591-d82d-4eca-b101-a7c25c54fe6b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.208259 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.208210 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.275830 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.275793 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-xvxs2"]
Apr 22 18:52:23.279402 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.279381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xvxs2"
Apr 22 18:52:23.282046 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.282002 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 18:52:23.290890 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.290817 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xvxs2"]
Apr 22 18:52:23.309522 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.309496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.309650 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.309583 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqmg\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-kube-api-access-5kqmg\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.309650 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.309608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/21334591-d82d-4eca-b101-a7c25c54fe6b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.309650 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.309642 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:52:23.309802 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.309662 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:52:23.309802 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.309684 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5: references non-existent secret key: tls.crt
Apr 22 18:52:23.309802 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.309751 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates podName:21334591-d82d-4eca-b101-a7c25c54fe6b nodeName:}" failed. No retries permitted until 2026-04-22 18:52:23.809733035 +0000 UTC m=+340.141241469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates") pod "keda-metrics-apiserver-7c9f485588-l5jl5" (UID: "21334591-d82d-4eca-b101-a7c25c54fe6b") : references non-existent secret key: tls.crt
Apr 22 18:52:23.309959 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.309943 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/21334591-d82d-4eca-b101-a7c25c54fe6b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.318598 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.318577 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqmg\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-kube-api-access-5kqmg\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.410692 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.410653 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj"
Apr 22 18:52:23.411184 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.410712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857wh\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-kube-api-access-857wh\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2"
Apr 22 18:52:23.411184 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.410801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-certificates\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2"
Apr 22 18:52:23.411184 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.410998 2570 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:52:23.411184 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.411058 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:52:23.411184 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.411072 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m4njj: references non-existent secret key: ca.crt
Apr 22 18:52:23.411416 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.411245 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates podName:e5f2627e-c328-4aa3-a3db-a86495d60aaa nodeName:}" failed. No retries permitted until 2026-04-22 18:52:24.411189168 +0000 UTC m=+340.742697600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates") pod "keda-operator-ffbb595cb-m4njj" (UID: "e5f2627e-c328-4aa3-a3db-a86495d60aaa") : references non-existent secret key: ca.crt
Apr 22 18:52:23.511924 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.511877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-857wh\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-kube-api-access-857wh\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2"
Apr 22 18:52:23.512159 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.511955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-certificates\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2"
Apr 22 18:52:23.512159 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.512093 2570 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 22 18:52:23.512159 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.512118 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-xvxs2: secret "keda-admission-webhooks-certs" not found
Apr 22 18:52:23.512347 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.512183 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-certificates podName:6f3cd941-436c-460c-86bc-3931a9dda5c1 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:24.012164119 +0000 UTC m=+340.343672555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-certificates") pod "keda-admission-cf49989db-xvxs2" (UID: "6f3cd941-436c-460c-86bc-3931a9dda5c1") : secret "keda-admission-webhooks-certs" not found
Apr 22 18:52:23.525249 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.525208 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-857wh\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-kube-api-access-857wh\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2"
Apr 22 18:52:23.813832 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:23.813801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"
Apr 22 18:52:23.813998 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.813929 2570 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:52:23.813998 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.813942 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:52:23.813998 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.813958 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5: references non-existent secret key: tls.crt
Apr 22 18:52:23.814123 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:23.814007 2570 nestedpendingoperations.go:348] Operation
for "{volumeName:kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates podName:21334591-d82d-4eca-b101-a7c25c54fe6b nodeName:}" failed. No retries permitted until 2026-04-22 18:52:24.813993451 +0000 UTC m=+341.145501894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates") pod "keda-metrics-apiserver-7c9f485588-l5jl5" (UID: "21334591-d82d-4eca-b101-a7c25c54fe6b") : references non-existent secret key: tls.crt Apr 22 18:52:24.015695 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.015656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-certificates\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2" Apr 22 18:52:24.018199 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.018181 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3cd941-436c-460c-86bc-3931a9dda5c1-certificates\") pod \"keda-admission-cf49989db-xvxs2\" (UID: \"6f3cd941-436c-460c-86bc-3931a9dda5c1\") " pod="openshift-keda/keda-admission-cf49989db-xvxs2" Apr 22 18:52:24.191331 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.191297 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xvxs2" Apr 22 18:52:24.327249 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.327221 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xvxs2"] Apr 22 18:52:24.329670 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:52:24.329642 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3cd941_436c_460c_86bc_3931a9dda5c1.slice/crio-28055da215eb4dc472490552364b3f8172ed5ce4b6518825f0a3919546128a63 WatchSource:0}: Error finding container 28055da215eb4dc472490552364b3f8172ed5ce4b6518825f0a3919546128a63: Status 404 returned error can't find the container with id 28055da215eb4dc472490552364b3f8172ed5ce4b6518825f0a3919546128a63 Apr 22 18:52:24.416812 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.416776 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xvxs2" event={"ID":"6f3cd941-436c-460c-86bc-3931a9dda5c1","Type":"ContainerStarted","Data":"28055da215eb4dc472490552364b3f8172ed5ce4b6518825f0a3919546128a63"} Apr 22 18:52:24.419306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.419271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:52:24.419420 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.419399 2570 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:24.419420 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.419415 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:24.419533 ip-10-0-133-163 kubenswrapper[2570]: E0422 
18:52:24.419427 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m4njj: references non-existent secret key: ca.crt Apr 22 18:52:24.419533 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.419489 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates podName:e5f2627e-c328-4aa3-a3db-a86495d60aaa nodeName:}" failed. No retries permitted until 2026-04-22 18:52:26.419470608 +0000 UTC m=+342.750979057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates") pod "keda-operator-ffbb595cb-m4njj" (UID: "e5f2627e-c328-4aa3-a3db-a86495d60aaa") : references non-existent secret key: ca.crt Apr 22 18:52:24.824242 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:24.824208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" Apr 22 18:52:24.824409 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.824353 2570 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:52:24.824409 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.824373 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:52:24.824409 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.824392 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5: references non-existent secret key: tls.crt Apr 22 18:52:24.824508 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:24.824448 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates podName:21334591-d82d-4eca-b101-a7c25c54fe6b nodeName:}" failed. No retries permitted until 2026-04-22 18:52:26.824434865 +0000 UTC m=+343.155943294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates") pod "keda-metrics-apiserver-7c9f485588-l5jl5" (UID: "21334591-d82d-4eca-b101-a7c25c54fe6b") : references non-existent secret key: tls.crt Apr 22 18:52:26.424318 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.424284 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xvxs2" event={"ID":"6f3cd941-436c-460c-86bc-3931a9dda5c1","Type":"ContainerStarted","Data":"47aa66debf856e30bed1d7ddc4435ec1026ebc70e606354947f3dd36d7f2fafe"} Apr 22 18:52:26.424662 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.424501 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-xvxs2" Apr 22 18:52:26.438536 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.438513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:52:26.438732 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:26.438717 2570 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:26.438768 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:26.438739 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:26.438768 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:26.438751 2570 
projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-m4njj: references non-existent secret key: ca.crt Apr 22 18:52:26.438831 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:52:26.438814 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates podName:e5f2627e-c328-4aa3-a3db-a86495d60aaa nodeName:}" failed. No retries permitted until 2026-04-22 18:52:30.438793172 +0000 UTC m=+346.770301621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates") pod "keda-operator-ffbb595cb-m4njj" (UID: "e5f2627e-c328-4aa3-a3db-a86495d60aaa") : references non-existent secret key: ca.crt Apr 22 18:52:26.441786 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.441750 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-xvxs2" podStartSLOduration=1.45783891 podStartE2EDuration="3.441737926s" podCreationTimestamp="2026-04-22 18:52:23 +0000 UTC" firstStartedPulling="2026-04-22 18:52:24.330826312 +0000 UTC m=+340.662334743" lastFinishedPulling="2026-04-22 18:52:26.314725328 +0000 UTC m=+342.646233759" observedRunningTime="2026-04-22 18:52:26.440456438 +0000 UTC m=+342.771964889" watchObservedRunningTime="2026-04-22 18:52:26.441737926 +0000 UTC m=+342.773246377" Apr 22 18:52:26.841772 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.841737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" Apr 22 18:52:26.844296 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.844266 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21334591-d82d-4eca-b101-a7c25c54fe6b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-l5jl5\" (UID: \"21334591-d82d-4eca-b101-a7c25c54fe6b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" Apr 22 18:52:26.953387 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:26.953354 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" Apr 22 18:52:27.071526 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:27.071502 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5"] Apr 22 18:52:27.073543 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:52:27.073508 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21334591_d82d_4eca_b101_a7c25c54fe6b.slice/crio-0fe386ffaf75c2d3bcea8a80948a6943a6fd659a0d15b30807d7728ba9422741 WatchSource:0}: Error finding container 0fe386ffaf75c2d3bcea8a80948a6943a6fd659a0d15b30807d7728ba9422741: Status 404 returned error can't find the container with id 0fe386ffaf75c2d3bcea8a80948a6943a6fd659a0d15b30807d7728ba9422741 Apr 22 18:52:27.428308 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:27.428274 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" event={"ID":"21334591-d82d-4eca-b101-a7c25c54fe6b","Type":"ContainerStarted","Data":"0fe386ffaf75c2d3bcea8a80948a6943a6fd659a0d15b30807d7728ba9422741"} Apr 22 18:52:30.439492 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.439450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" 
event={"ID":"21334591-d82d-4eca-b101-a7c25c54fe6b","Type":"ContainerStarted","Data":"cc1917c4fc76e89917cb6a8747045fc8f658f209e4a1fdd88b7a154f571f6061"} Apr 22 18:52:30.439855 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.439581 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" Apr 22 18:52:30.457408 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.457360 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" podStartSLOduration=4.965491738 podStartE2EDuration="7.457347326s" podCreationTimestamp="2026-04-22 18:52:23 +0000 UTC" firstStartedPulling="2026-04-22 18:52:27.074809177 +0000 UTC m=+343.406317607" lastFinishedPulling="2026-04-22 18:52:29.566664749 +0000 UTC m=+345.898173195" observedRunningTime="2026-04-22 18:52:30.455832989 +0000 UTC m=+346.787341464" watchObservedRunningTime="2026-04-22 18:52:30.457347326 +0000 UTC m=+346.788855777" Apr 22 18:52:30.476004 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.475980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:52:30.478142 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.478121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5f2627e-c328-4aa3-a3db-a86495d60aaa-certificates\") pod \"keda-operator-ffbb595cb-m4njj\" (UID: \"e5f2627e-c328-4aa3-a3db-a86495d60aaa\") " pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:52:30.508603 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.508581 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:52:30.833928 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:30.833900 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-m4njj"] Apr 22 18:52:30.836202 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:52:30.836173 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f2627e_c328_4aa3_a3db_a86495d60aaa.slice/crio-6285e89e498994ec5cd7f9bfca7d458aacaf30f88f324aee581446f28b2eece9 WatchSource:0}: Error finding container 6285e89e498994ec5cd7f9bfca7d458aacaf30f88f324aee581446f28b2eece9: Status 404 returned error can't find the container with id 6285e89e498994ec5cd7f9bfca7d458aacaf30f88f324aee581446f28b2eece9 Apr 22 18:52:31.445061 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:31.445005 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-m4njj" event={"ID":"e5f2627e-c328-4aa3-a3db-a86495d60aaa","Type":"ContainerStarted","Data":"6285e89e498994ec5cd7f9bfca7d458aacaf30f88f324aee581446f28b2eece9"} Apr 22 18:52:34.456614 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:34.456578 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-m4njj" event={"ID":"e5f2627e-c328-4aa3-a3db-a86495d60aaa","Type":"ContainerStarted","Data":"5a40477c7f253ee03696e427119d9a125f1aef8e31e8846ecb856dc4a73af908"} Apr 22 18:52:34.457078 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:34.456674 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:52:34.472945 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:34.472889 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-m4njj" podStartSLOduration=9.740982541 podStartE2EDuration="12.472874746s" 
podCreationTimestamp="2026-04-22 18:52:22 +0000 UTC" firstStartedPulling="2026-04-22 18:52:30.837692631 +0000 UTC m=+347.169201075" lastFinishedPulling="2026-04-22 18:52:33.569584834 +0000 UTC m=+349.901093280" observedRunningTime="2026-04-22 18:52:34.471819662 +0000 UTC m=+350.803328114" watchObservedRunningTime="2026-04-22 18:52:34.472874746 +0000 UTC m=+350.804383200" Apr 22 18:52:41.449965 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:41.449934 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-l5jl5" Apr 22 18:52:43.414857 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:43.414826 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bz2l6" Apr 22 18:52:47.431460 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:47.431421 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-xvxs2" Apr 22 18:52:55.462271 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:52:55.462240 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-m4njj" Apr 22 18:53:28.772421 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.772388 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-vj77p"] Apr 22 18:53:28.775827 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.775800 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:28.780891 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.780861 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 18:53:28.782302 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.782188 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:53:28.782628 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.782460 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-fgxcd\"" Apr 22 18:53:28.782628 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.782472 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:53:28.785295 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.785271 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v"] Apr 22 18:53:28.789558 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.789541 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-vj77p"] Apr 22 18:53:28.789791 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.789773 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.792570 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.792385 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-7zsw5\"" Apr 22 18:53:28.792676 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.792607 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:53:28.793735 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.793715 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v"] Apr 22 18:53:28.868690 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.868659 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkmxj\" (UniqueName: \"kubernetes.io/projected/1ab8212a-8821-43a9-b747-9d4ab798c946-kube-api-access-kkmxj\") pod \"kserve-controller-manager-6f655776dd-vj77p\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:28.868860 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.868695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k49c\" (UniqueName: \"kubernetes.io/projected/0ae9a00d-a015-437f-8f4b-bef203a3c1e7-kube-api-access-9k49c\") pod \"llmisvc-controller-manager-68cc5db7c4-pfj6v\" (UID: \"0ae9a00d-a015-437f-8f4b-bef203a3c1e7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.868860 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.868828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ab8212a-8821-43a9-b747-9d4ab798c946-cert\") pod \"kserve-controller-manager-6f655776dd-vj77p\" (UID: 
\"1ab8212a-8821-43a9-b747-9d4ab798c946\") " pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:28.868932 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.868904 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ae9a00d-a015-437f-8f4b-bef203a3c1e7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-pfj6v\" (UID: \"0ae9a00d-a015-437f-8f4b-bef203a3c1e7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.970363 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.970328 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ae9a00d-a015-437f-8f4b-bef203a3c1e7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-pfj6v\" (UID: \"0ae9a00d-a015-437f-8f4b-bef203a3c1e7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.970544 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.970385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkmxj\" (UniqueName: \"kubernetes.io/projected/1ab8212a-8821-43a9-b747-9d4ab798c946-kube-api-access-kkmxj\") pod \"kserve-controller-manager-6f655776dd-vj77p\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:28.970544 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.970405 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k49c\" (UniqueName: \"kubernetes.io/projected/0ae9a00d-a015-437f-8f4b-bef203a3c1e7-kube-api-access-9k49c\") pod \"llmisvc-controller-manager-68cc5db7c4-pfj6v\" (UID: \"0ae9a00d-a015-437f-8f4b-bef203a3c1e7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.970544 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.970456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ab8212a-8821-43a9-b747-9d4ab798c946-cert\") pod \"kserve-controller-manager-6f655776dd-vj77p\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:28.972928 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.972905 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ae9a00d-a015-437f-8f4b-bef203a3c1e7-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-pfj6v\" (UID: \"0ae9a00d-a015-437f-8f4b-bef203a3c1e7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.973071 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.972951 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ab8212a-8821-43a9-b747-9d4ab798c946-cert\") pod \"kserve-controller-manager-6f655776dd-vj77p\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:28.983559 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.983530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k49c\" (UniqueName: \"kubernetes.io/projected/0ae9a00d-a015-437f-8f4b-bef203a3c1e7-kube-api-access-9k49c\") pod \"llmisvc-controller-manager-68cc5db7c4-pfj6v\" (UID: \"0ae9a00d-a015-437f-8f4b-bef203a3c1e7\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:28.984193 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:28.984174 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkmxj\" (UniqueName: \"kubernetes.io/projected/1ab8212a-8821-43a9-b747-9d4ab798c946-kube-api-access-kkmxj\") pod \"kserve-controller-manager-6f655776dd-vj77p\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:29.094276 ip-10-0-133-163 kubenswrapper[2570]: 
I0422 18:53:29.094181 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:29.104056 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:29.104026 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:29.230887 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:29.230859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-vj77p"] Apr 22 18:53:29.232889 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:53:29.232862 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab8212a_8821_43a9_b747_9d4ab798c946.slice/crio-483e263c64b395a06d158370a3ecebd29283c197d49be5f16667ac1b812fd6ce WatchSource:0}: Error finding container 483e263c64b395a06d158370a3ecebd29283c197d49be5f16667ac1b812fd6ce: Status 404 returned error can't find the container with id 483e263c64b395a06d158370a3ecebd29283c197d49be5f16667ac1b812fd6ce Apr 22 18:53:29.254666 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:29.254644 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v"] Apr 22 18:53:29.256441 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:53:29.256417 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0ae9a00d_a015_437f_8f4b_bef203a3c1e7.slice/crio-05719ad0cd76958c6252d6902187b9fef9c49e0714abb44d7cc8dd7a2c8091f8 WatchSource:0}: Error finding container 05719ad0cd76958c6252d6902187b9fef9c49e0714abb44d7cc8dd7a2c8091f8: Status 404 returned error can't find the container with id 05719ad0cd76958c6252d6902187b9fef9c49e0714abb44d7cc8dd7a2c8091f8 Apr 22 18:53:29.638620 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:29.638583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-6f655776dd-vj77p" event={"ID":"1ab8212a-8821-43a9-b747-9d4ab798c946","Type":"ContainerStarted","Data":"483e263c64b395a06d158370a3ecebd29283c197d49be5f16667ac1b812fd6ce"} Apr 22 18:53:29.639598 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:29.639578 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" event={"ID":"0ae9a00d-a015-437f-8f4b-bef203a3c1e7","Type":"ContainerStarted","Data":"05719ad0cd76958c6252d6902187b9fef9c49e0714abb44d7cc8dd7a2c8091f8"} Apr 22 18:53:32.653157 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:32.653122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" event={"ID":"1ab8212a-8821-43a9-b747-9d4ab798c946","Type":"ContainerStarted","Data":"5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0"} Apr 22 18:53:32.653574 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:32.653239 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:53:32.670184 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:32.670141 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" podStartSLOduration=1.381931009 podStartE2EDuration="4.670127442s" podCreationTimestamp="2026-04-22 18:53:28 +0000 UTC" firstStartedPulling="2026-04-22 18:53:29.234647825 +0000 UTC m=+405.566156256" lastFinishedPulling="2026-04-22 18:53:32.522844244 +0000 UTC m=+408.854352689" observedRunningTime="2026-04-22 18:53:32.668938392 +0000 UTC m=+409.000446845" watchObservedRunningTime="2026-04-22 18:53:32.670127442 +0000 UTC m=+409.001635894" Apr 22 18:53:33.657322 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:33.657287 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" 
event={"ID":"0ae9a00d-a015-437f-8f4b-bef203a3c1e7","Type":"ContainerStarted","Data":"ff0ce00e2da9cf19a2d0910e577d8b8313c5ae707f725d4c01b948b0b8a044a6"} Apr 22 18:53:33.657815 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:33.657501 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:53:33.675594 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:53:33.675545 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" podStartSLOduration=2.371111897 podStartE2EDuration="5.675530609s" podCreationTimestamp="2026-04-22 18:53:28 +0000 UTC" firstStartedPulling="2026-04-22 18:53:29.257675609 +0000 UTC m=+405.589184042" lastFinishedPulling="2026-04-22 18:53:32.562094309 +0000 UTC m=+408.893602754" observedRunningTime="2026-04-22 18:53:33.67353054 +0000 UTC m=+410.005038992" watchObservedRunningTime="2026-04-22 18:53:33.675530609 +0000 UTC m=+410.007039061" Apr 22 18:54:03.662899 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:03.662824 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:54:04.664502 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:04.664470 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pfj6v" Apr 22 18:54:06.257426 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.257396 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-vj77p"] Apr 22 18:54:06.257820 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.257616 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" podUID="1ab8212a-8821-43a9-b747-9d4ab798c946" containerName="manager" 
containerID="cri-o://5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0" gracePeriod=10 Apr 22 18:54:06.285443 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.285417 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-pzqhj"] Apr 22 18:54:06.288743 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.288726 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.297342 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.297317 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-pzqhj"] Apr 22 18:54:06.397608 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.397574 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd482c2d-ea55-4741-bb2e-ade54e49f678-cert\") pod \"kserve-controller-manager-6f655776dd-pzqhj\" (UID: \"cd482c2d-ea55-4741-bb2e-ade54e49f678\") " pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.397750 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.397617 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vgn\" (UniqueName: \"kubernetes.io/projected/cd482c2d-ea55-4741-bb2e-ade54e49f678-kube-api-access-w7vgn\") pod \"kserve-controller-manager-6f655776dd-pzqhj\" (UID: \"cd482c2d-ea55-4741-bb2e-ade54e49f678\") " pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.493508 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.493488 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:54:06.498311 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.498291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd482c2d-ea55-4741-bb2e-ade54e49f678-cert\") pod \"kserve-controller-manager-6f655776dd-pzqhj\" (UID: \"cd482c2d-ea55-4741-bb2e-ade54e49f678\") " pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.498407 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.498321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vgn\" (UniqueName: \"kubernetes.io/projected/cd482c2d-ea55-4741-bb2e-ade54e49f678-kube-api-access-w7vgn\") pod \"kserve-controller-manager-6f655776dd-pzqhj\" (UID: \"cd482c2d-ea55-4741-bb2e-ade54e49f678\") " pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.500527 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.500508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd482c2d-ea55-4741-bb2e-ade54e49f678-cert\") pod \"kserve-controller-manager-6f655776dd-pzqhj\" (UID: \"cd482c2d-ea55-4741-bb2e-ade54e49f678\") " pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.506745 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.506724 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vgn\" (UniqueName: \"kubernetes.io/projected/cd482c2d-ea55-4741-bb2e-ade54e49f678-kube-api-access-w7vgn\") pod \"kserve-controller-manager-6f655776dd-pzqhj\" (UID: \"cd482c2d-ea55-4741-bb2e-ade54e49f678\") " pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.599164 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.599093 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkmxj\" (UniqueName: 
\"kubernetes.io/projected/1ab8212a-8821-43a9-b747-9d4ab798c946-kube-api-access-kkmxj\") pod \"1ab8212a-8821-43a9-b747-9d4ab798c946\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " Apr 22 18:54:06.599164 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.599123 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ab8212a-8821-43a9-b747-9d4ab798c946-cert\") pod \"1ab8212a-8821-43a9-b747-9d4ab798c946\" (UID: \"1ab8212a-8821-43a9-b747-9d4ab798c946\") " Apr 22 18:54:06.601152 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.601113 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab8212a-8821-43a9-b747-9d4ab798c946-kube-api-access-kkmxj" (OuterVolumeSpecName: "kube-api-access-kkmxj") pod "1ab8212a-8821-43a9-b747-9d4ab798c946" (UID: "1ab8212a-8821-43a9-b747-9d4ab798c946"). InnerVolumeSpecName "kube-api-access-kkmxj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:06.601152 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.601145 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab8212a-8821-43a9-b747-9d4ab798c946-cert" (OuterVolumeSpecName: "cert") pod "1ab8212a-8821-43a9-b747-9d4ab798c946" (UID: "1ab8212a-8821-43a9-b747-9d4ab798c946"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:54:06.640490 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.640455 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:06.700490 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.700456 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkmxj\" (UniqueName: \"kubernetes.io/projected/1ab8212a-8821-43a9-b747-9d4ab798c946-kube-api-access-kkmxj\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:54:06.700490 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.700486 2570 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ab8212a-8821-43a9-b747-9d4ab798c946-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:54:06.757969 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.757942 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-pzqhj"] Apr 22 18:54:06.760435 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:54:06.760398 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd482c2d_ea55_4741_bb2e_ade54e49f678.slice/crio-444e378c89a2332d3545763f081cd5139653f9969b9c24930e5a838c3896d7c6 WatchSource:0}: Error finding container 444e378c89a2332d3545763f081cd5139653f9969b9c24930e5a838c3896d7c6: Status 404 returned error can't find the container with id 444e378c89a2332d3545763f081cd5139653f9969b9c24930e5a838c3896d7c6 Apr 22 18:54:06.766880 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.766857 2570 generic.go:358] "Generic (PLEG): container finished" podID="1ab8212a-8821-43a9-b747-9d4ab798c946" containerID="5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0" exitCode=0 Apr 22 18:54:06.766971 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.766915 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" Apr 22 18:54:06.766971 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.766938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" event={"ID":"1ab8212a-8821-43a9-b747-9d4ab798c946","Type":"ContainerDied","Data":"5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0"} Apr 22 18:54:06.767062 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.766969 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-vj77p" event={"ID":"1ab8212a-8821-43a9-b747-9d4ab798c946","Type":"ContainerDied","Data":"483e263c64b395a06d158370a3ecebd29283c197d49be5f16667ac1b812fd6ce"} Apr 22 18:54:06.767062 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.766986 2570 scope.go:117] "RemoveContainer" containerID="5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0" Apr 22 18:54:06.768394 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.768374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" event={"ID":"cd482c2d-ea55-4741-bb2e-ade54e49f678","Type":"ContainerStarted","Data":"444e378c89a2332d3545763f081cd5139653f9969b9c24930e5a838c3896d7c6"} Apr 22 18:54:06.775858 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.775837 2570 scope.go:117] "RemoveContainer" containerID="5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0" Apr 22 18:54:06.776126 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:54:06.776106 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0\": container with ID starting with 5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0 not found: ID does not exist" containerID="5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0" Apr 22 
18:54:06.776229 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.776132 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0"} err="failed to get container status \"5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0\": rpc error: code = NotFound desc = could not find container \"5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0\": container with ID starting with 5e382d25ba7da66197486859ce479a40235cf6deae79541845d778cf9ee470d0 not found: ID does not exist" Apr 22 18:54:06.789741 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.789717 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-vj77p"] Apr 22 18:54:06.796616 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:06.796592 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-vj77p"] Apr 22 18:54:07.774062 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:07.774029 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" event={"ID":"cd482c2d-ea55-4741-bb2e-ade54e49f678","Type":"ContainerStarted","Data":"6b71114751685991b2320dbed2d9aac44067e5ad326b1e871437fb464cc23828"} Apr 22 18:54:07.774500 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:07.774180 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:54:07.795068 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:07.795001 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" podStartSLOduration=1.387015853 podStartE2EDuration="1.794987644s" podCreationTimestamp="2026-04-22 18:54:06 +0000 UTC" firstStartedPulling="2026-04-22 18:54:06.761699077 +0000 UTC m=+443.093207510" lastFinishedPulling="2026-04-22 
18:54:07.169670871 +0000 UTC m=+443.501179301" observedRunningTime="2026-04-22 18:54:07.792654065 +0000 UTC m=+444.124162519" watchObservedRunningTime="2026-04-22 18:54:07.794987644 +0000 UTC m=+444.126496096" Apr 22 18:54:08.248229 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:08.248198 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab8212a-8821-43a9-b747-9d4ab798c946" path="/var/lib/kubelet/pods/1ab8212a-8821-43a9-b747-9d4ab798c946/volumes" Apr 22 18:54:38.781737 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:54:38.781707 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-pzqhj" Apr 22 18:55:10.502226 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.502192 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d5598dc5d-8fbh5"] Apr 22 18:55:10.502973 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.502644 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ab8212a-8821-43a9-b747-9d4ab798c946" containerName="manager" Apr 22 18:55:10.502973 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.502663 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab8212a-8821-43a9-b747-9d4ab798c946" containerName="manager" Apr 22 18:55:10.502973 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.502766 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ab8212a-8821-43a9-b747-9d4ab798c946" containerName="manager" Apr 22 18:55:10.506149 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.506126 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.517796 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.517772 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5598dc5d-8fbh5"] Apr 22 18:55:10.525995 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.525972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-serving-cert\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.526106 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.526041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-trusted-ca-bundle\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.526106 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.526085 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-config\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.526181 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.526105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-oauth-serving-cert\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 
18:55:10.526215 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.526172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hff4\" (UniqueName: \"kubernetes.io/projected/3b4f4c6c-d844-482a-ae04-85045e83a1b7-kube-api-access-4hff4\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.526249 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.526219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-service-ca\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.526282 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.526250 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-oauth-config\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627350 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-oauth-serving-cert\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627350 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627365 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hff4\" (UniqueName: \"kubernetes.io/projected/3b4f4c6c-d844-482a-ae04-85045e83a1b7-kube-api-access-4hff4\") pod 
\"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627586 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627405 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-service-ca\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627586 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627431 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-oauth-config\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627691 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627574 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-serving-cert\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627691 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-trusted-ca-bundle\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.627792 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.627719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-config\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.628144 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.628096 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-oauth-serving-cert\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.628342 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.628322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-config\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.628342 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.628331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-service-ca\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.628557 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.628537 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4f4c6c-d844-482a-ae04-85045e83a1b7-trusted-ca-bundle\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.629842 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.629816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-oauth-config\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.630149 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.630128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b4f4c6c-d844-482a-ae04-85045e83a1b7-console-serving-cert\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.636554 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.636530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hff4\" (UniqueName: \"kubernetes.io/projected/3b4f4c6c-d844-482a-ae04-85045e83a1b7-kube-api-access-4hff4\") pod \"console-5d5598dc5d-8fbh5\" (UID: \"3b4f4c6c-d844-482a-ae04-85045e83a1b7\") " pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:10.816966 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:10.816891 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:11.146108 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:11.146083 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5598dc5d-8fbh5"] Apr 22 18:55:11.148695 ip-10-0-133-163 kubenswrapper[2570]: W0422 18:55:11.148665 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4f4c6c_d844_482a_ae04_85045e83a1b7.slice/crio-036755b67f81746f0628f5e9ee963d5d5090d569b7a3fa7da5de31e63c3ae11f WatchSource:0}: Error finding container 036755b67f81746f0628f5e9ee963d5d5090d569b7a3fa7da5de31e63c3ae11f: Status 404 returned error can't find the container with id 036755b67f81746f0628f5e9ee963d5d5090d569b7a3fa7da5de31e63c3ae11f Apr 22 18:55:11.978908 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:11.978867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5598dc5d-8fbh5" event={"ID":"3b4f4c6c-d844-482a-ae04-85045e83a1b7","Type":"ContainerStarted","Data":"e055b2542cb2a41a8b96e7ebbfc90c8e8196095a532445c02d0655618b72ff6b"} Apr 22 18:55:11.978908 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:11.978908 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5598dc5d-8fbh5" event={"ID":"3b4f4c6c-d844-482a-ae04-85045e83a1b7","Type":"ContainerStarted","Data":"036755b67f81746f0628f5e9ee963d5d5090d569b7a3fa7da5de31e63c3ae11f"} Apr 22 18:55:11.998004 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:11.997955 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d5598dc5d-8fbh5" podStartSLOduration=1.997942117 podStartE2EDuration="1.997942117s" podCreationTimestamp="2026-04-22 18:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:55:11.995948498 +0000 UTC 
m=+508.327456962" watchObservedRunningTime="2026-04-22 18:55:11.997942117 +0000 UTC m=+508.329450569" Apr 22 18:55:20.817306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:20.817259 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:20.817306 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:20.817311 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:20.822004 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:20.821980 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:21.010248 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:21.010222 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d5598dc5d-8fbh5" Apr 22 18:55:21.057872 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:21.057834 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59df7f7578-8hvcm"] Apr 22 18:55:46.078701 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.078590 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59df7f7578-8hvcm" podUID="42bba0ec-b46b-495c-ab02-c387ba33a97a" containerName="console" containerID="cri-o://af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690" gracePeriod=15 Apr 22 18:55:46.317756 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.317734 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59df7f7578-8hvcm_42bba0ec-b46b-495c-ab02-c387ba33a97a/console/0.log" Apr 22 18:55:46.317858 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.317794 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:55:46.439226 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439195 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-oauth-serving-cert\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439395 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439233 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wmcl\" (UniqueName: \"kubernetes.io/projected/42bba0ec-b46b-495c-ab02-c387ba33a97a-kube-api-access-2wmcl\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439395 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439293 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-service-ca\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439395 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439313 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-trusted-ca-bundle\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439395 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439347 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-oauth-config\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439395 
ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439375 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-serving-cert\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439647 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439474 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-config\") pod \"42bba0ec-b46b-495c-ab02-c387ba33a97a\" (UID: \"42bba0ec-b46b-495c-ab02-c387ba33a97a\") " Apr 22 18:55:46.439647 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439617 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:46.439833 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439808 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-oauth-serving-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:46.439910 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439864 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:46.439994 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.439962 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-config" (OuterVolumeSpecName: "console-config") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:46.440313 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.440290 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-service-ca" (OuterVolumeSpecName: "service-ca") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:55:46.441651 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.441627 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:46.441750 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.441668 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bba0ec-b46b-495c-ab02-c387ba33a97a-kube-api-access-2wmcl" (OuterVolumeSpecName: "kube-api-access-2wmcl") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "kube-api-access-2wmcl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:46.441750 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.441700 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42bba0ec-b46b-495c-ab02-c387ba33a97a" (UID: "42bba0ec-b46b-495c-ab02-c387ba33a97a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:46.540391 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.540359 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2wmcl\" (UniqueName: \"kubernetes.io/projected/42bba0ec-b46b-495c-ab02-c387ba33a97a-kube-api-access-2wmcl\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:46.540391 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.540385 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-service-ca\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:46.540391 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.540396 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-trusted-ca-bundle\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:46.540608 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.540405 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-oauth-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:46.540608 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.540415 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-serving-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:46.540608 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:46.540425 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42bba0ec-b46b-495c-ab02-c387ba33a97a-console-config\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 22 18:55:47.090407 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.090376 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59df7f7578-8hvcm_42bba0ec-b46b-495c-ab02-c387ba33a97a/console/0.log" Apr 22 18:55:47.090862 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.090421 2570 generic.go:358] "Generic (PLEG): container finished" podID="42bba0ec-b46b-495c-ab02-c387ba33a97a" containerID="af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690" exitCode=2 Apr 22 18:55:47.090862 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.090470 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59df7f7578-8hvcm" event={"ID":"42bba0ec-b46b-495c-ab02-c387ba33a97a","Type":"ContainerDied","Data":"af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690"} Apr 22 18:55:47.090862 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.090482 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59df7f7578-8hvcm" Apr 22 18:55:47.090862 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.090502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59df7f7578-8hvcm" event={"ID":"42bba0ec-b46b-495c-ab02-c387ba33a97a","Type":"ContainerDied","Data":"6a1f4a8a025d1b11e1a308b14e3a06bb587cd68e256b6b95321d8cf0b267cf0a"} Apr 22 18:55:47.090862 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.090523 2570 scope.go:117] "RemoveContainer" containerID="af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690" Apr 22 18:55:47.098786 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.098770 2570 scope.go:117] "RemoveContainer" containerID="af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690" Apr 22 18:55:47.099031 ip-10-0-133-163 kubenswrapper[2570]: E0422 18:55:47.098992 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690\": container with ID starting with af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690 not found: ID does not exist" containerID="af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690" Apr 22 18:55:47.099084 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.099040 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690"} err="failed to get container status \"af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690\": rpc error: code = NotFound desc = could not find container \"af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690\": container with ID starting with af7bc12e7c289fe3b5b2de89c4d8db2a8724d00e8ea84f3b28756cd097845690 not found: ID does not exist" Apr 22 18:55:47.113666 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.113647 2570 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59df7f7578-8hvcm"] Apr 22 18:55:47.118001 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:47.117981 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59df7f7578-8hvcm"] Apr 22 18:55:48.253415 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:55:48.253379 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42bba0ec-b46b-495c-ab02-c387ba33a97a" path="/var/lib/kubelet/pods/42bba0ec-b46b-495c-ab02-c387ba33a97a/volumes" Apr 22 18:56:44.144122 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:56:44.144084 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:56:44.144593 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:56:44.144231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 18:56:44.149806 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:56:44.149781 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 18:56:44.150000 ip-10-0-133-163 kubenswrapper[2570]: I0422 18:56:44.149984 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:01:44.174923 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:01:44.174841 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:01:44.176627 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:01:44.176600 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:01:44.182379 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:01:44.182361 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:01:44.183872 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:01:44.183853 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:06:44.199841 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:06:44.199809 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:06:44.202733 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:06:44.202711 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:06:44.205694 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:06:44.205675 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:06:44.208525 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:06:44.208504 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:11:44.224403 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:11:44.224372 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:11:44.228892 ip-10-0-133-163 
kubenswrapper[2570]: I0422 19:11:44.228871 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:11:44.230458 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:11:44.230438 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:11:44.234659 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:11:44.234642 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:16:44.254238 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:16:44.254202 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:16:44.260146 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:16:44.260122 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:16:44.263128 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:16:44.263107 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:16:44.268116 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:16:44.268096 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:21:44.275436 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:21:44.275410 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:21:44.281064 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:21:44.281043 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:21:44.285405 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:21:44.285387 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:21:44.290561 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:21:44.290544 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:26:44.296940 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:26:44.296915 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:26:44.302482 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:26:44.302457 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:26:44.307003 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:26:44.306985 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:26:44.311652 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:26:44.311637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:31:44.322762 ip-10-0-133-163 
kubenswrapper[2570]: I0422 19:31:44.322684 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:31:44.328519 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:31:44.328494 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:31:44.331313 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:31:44.331287 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:31:44.336774 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:31:44.336757 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:36:44.345913 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:36:44.345882 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:36:44.351915 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:36:44.351890 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:36:44.355132 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:36:44.355112 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:36:44.363226 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:36:44.363211 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:41:44.370145 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:41:44.370113 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:41:44.375975 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:41:44.375947 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:41:44.382034 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:41:44.381994 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:41:44.387911 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:41:44.387894 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:46:44.392532 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:46:44.392430 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:46:44.399542 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:46:44.398417 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:46:44.405754 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:46:44.405736 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:46:44.410877 ip-10-0-133-163 
kubenswrapper[2570]: I0422 19:46:44.410861 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:51:44.414830 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:51:44.414725 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:51:44.420712 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:51:44.420689 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:51:44.428799 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:51:44.428779 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:51:44.434059 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:51:44.434041 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log" Apr 22 19:52:09.155163 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:09.155133 2570 ???:1] "http: TLS handshake error from 10.0.129.249:35442: EOF" Apr 22 19:52:09.162303 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:09.162277 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2l626_758c6e79-f76a-46e0-b353-55929fcd68c4/global-pull-secret-syncer/0.log" Apr 22 19:52:09.374449 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:09.374417 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kzsjh_188e6686-56b2-4173-8c6e-37c8297781e8/konnectivity-agent/0.log" Apr 22 19:52:09.465195 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:09.465129 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-163.ec2.internal_8147dca2f1846ffe58ac40c8a9cdfc0b/haproxy/0.log" Apr 22 19:52:12.531131 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.531104 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/alertmanager/0.log" Apr 22 19:52:12.565107 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.565083 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/config-reloader/0.log" Apr 22 19:52:12.595545 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.595525 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/kube-rbac-proxy-web/0.log" Apr 22 19:52:12.628820 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.628800 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/kube-rbac-proxy/0.log" Apr 22 19:52:12.656555 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.656537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/kube-rbac-proxy-metric/0.log" Apr 22 19:52:12.685527 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.685509 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/prom-label-proxy/0.log" Apr 22 19:52:12.711533 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.711508 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6ba36426-4392-43bf-b52e-099fcad1b911/init-config-reloader/0.log" Apr 22 19:52:12.760587 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.760565 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qhml5_82e68df6-e973-4eb4-9f72-57676895ca9b/cluster-monitoring-operator/0.log" Apr 22 19:52:12.876109 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.876085 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fc6d96979-n9gdr_ba1c7391-3310-4c4f-84cd-552e47593e99/metrics-server/0.log" Apr 22 19:52:12.904774 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:12.904751 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zm2s7_d50ce585-dd21-4431-b64f-6f4220eb4fab/monitoring-plugin/0.log" Apr 22 19:52:13.015529 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:13.015506 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5bflg_46d280c1-d49f-4d77-8930-94aad7b2b5e2/node-exporter/0.log" Apr 22 19:52:13.040108 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:13.040085 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5bflg_46d280c1-d49f-4d77-8930-94aad7b2b5e2/kube-rbac-proxy/0.log" Apr 22 19:52:13.067041 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:13.067002 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5bflg_46d280c1-d49f-4d77-8930-94aad7b2b5e2/init-textfile/0.log" Apr 22 19:52:13.495224 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:13.495196 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-mxhqc_47dff405-7883-453c-b066-b2b17d2440a6/prometheus-operator/0.log" Apr 22 19:52:13.517239 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:13.517214 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-mxhqc_47dff405-7883-453c-b066-b2b17d2440a6/kube-rbac-proxy/0.log" Apr 22 19:52:13.565532 
ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:13.565513 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-gm7d8_b7b9bf0c-4a86-427f-881b-3915077f0c2d/prometheus-operator-admission-webhook/0.log" Apr 22 19:52:15.476949 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:15.476921 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/2.log" Apr 22 19:52:15.481198 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:15.481176 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nzr8k_26e711ad-d134-4b34-9606-2a2cb1b1f283/console-operator/3.log" Apr 22 19:52:15.882332 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:15.882303 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d5598dc5d-8fbh5_3b4f4c6c-d844-482a-ae04-85045e83a1b7/console/0.log" Apr 22 19:52:15.920951 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:15.920928 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7fbkv_df2750ad-dfcb-4f8d-ad05-8e76b5f74f48/download-server/0.log" Apr 22 19:52:16.265495 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.265419 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"] Apr 22 19:52:16.265800 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.265788 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42bba0ec-b46b-495c-ab02-c387ba33a97a" containerName="console" Apr 22 19:52:16.265845 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.265802 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bba0ec-b46b-495c-ab02-c387ba33a97a" containerName="console" Apr 22 19:52:16.265884 ip-10-0-133-163 
kubenswrapper[2570]: I0422 19:52:16.265854 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="42bba0ec-b46b-495c-ab02-c387ba33a97a" containerName="console" Apr 22 19:52:16.268896 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.268876 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.271557 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.271536 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8tvdg\"/\"default-dockercfg-ksxp9\"" Apr 22 19:52:16.271668 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.271537 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8tvdg\"/\"openshift-service-ca.crt\"" Apr 22 19:52:16.272558 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.272543 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8tvdg\"/\"kube-root-ca.crt\"" Apr 22 19:52:16.278063 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.278039 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"] Apr 22 19:52:16.379327 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.379293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-podres\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.379493 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.379337 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-lib-modules\") pod 
\"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.379493 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.379373 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-sys\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.379493 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.379391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pls4g\" (UniqueName: \"kubernetes.io/projected/0d078581-91ca-481c-ac48-43eb8c0ac00b-kube-api-access-pls4g\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.379493 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.379414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-proc\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.480627 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480597 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pls4g\" (UniqueName: \"kubernetes.io/projected/0d078581-91ca-481c-ac48-43eb8c0ac00b-kube-api-access-pls4g\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 
19:52:16.480633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-proc\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-podres\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480703 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-lib-modules\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-sys\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-sys\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " 
pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-proc\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-podres\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:16.480992 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.480935 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d078581-91ca-481c-ac48-43eb8c0ac00b-lib-modules\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:16.491537 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.491512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pls4g\" (UniqueName: \"kubernetes.io/projected/0d078581-91ca-481c-ac48-43eb8c0ac00b-kube-api-access-pls4g\") pod \"perf-node-gather-daemonset-w285n\" (UID: \"0d078581-91ca-481c-ac48-43eb8c0ac00b\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:16.579714 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.579648 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:16.907882 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.907849 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"]
Apr 22 19:52:16.910957 ip-10-0-133-163 kubenswrapper[2570]: W0422 19:52:16.910928 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0d078581_91ca_481c_ac48_43eb8c0ac00b.slice/crio-291dcb5cfb3b21445cffc555ce161f93cbea543fd7fd3c67a7b1412e83e91028 WatchSource:0}: Error finding container 291dcb5cfb3b21445cffc555ce161f93cbea543fd7fd3c67a7b1412e83e91028: Status 404 returned error can't find the container with id 291dcb5cfb3b21445cffc555ce161f93cbea543fd7fd3c67a7b1412e83e91028
Apr 22 19:52:16.912585 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:16.912560 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:52:17.178426 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.178348 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" event={"ID":"0d078581-91ca-481c-ac48-43eb8c0ac00b","Type":"ContainerStarted","Data":"25c9d252925c13e3e074c865681d6f87e03e247a586044dad084cff16f570769"}
Apr 22 19:52:17.178426 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.178383 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" event={"ID":"0d078581-91ca-481c-ac48-43eb8c0ac00b","Type":"ContainerStarted","Data":"291dcb5cfb3b21445cffc555ce161f93cbea543fd7fd3c67a7b1412e83e91028"}
Apr 22 19:52:17.178601 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.178471 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:17.196603 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.196559 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n" podStartSLOduration=1.196547507 podStartE2EDuration="1.196547507s" podCreationTimestamp="2026-04-22 19:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:52:17.194304797 +0000 UTC m=+3933.525813248" watchObservedRunningTime="2026-04-22 19:52:17.196547507 +0000 UTC m=+3933.528055959"
Apr 22 19:52:17.241484 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.241458 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6zgnq_030083d5-b0a7-4438-a7f6-06eae3c80777/dns/0.log"
Apr 22 19:52:17.264809 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.264778 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6zgnq_030083d5-b0a7-4438-a7f6-06eae3c80777/kube-rbac-proxy/0.log"
Apr 22 19:52:17.317728 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.317707 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cpq2l_e190ea4b-606c-4dd3-9785-eb0178af92e9/dns-node-resolver/0.log"
Apr 22 19:52:17.800627 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.800585 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-558fc6cd4f-b58xq_f24cbaf9-5758-415a-8259-120faaae9cf8/registry/0.log"
Apr 22 19:52:17.874473 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:17.874441 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lxsx5_0484c97f-ca09-4491-bd36-1cd68e364f27/node-ca/0.log"
Apr 22 19:52:19.086752 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:19.086702 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n4k5q_b72ffd58-b0b5-48d0-b56c-8d4e7f30307b/serve-healthcheck-canary/0.log"
Apr 22 19:52:19.528580 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:19.528554 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gntnd_3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b/kube-rbac-proxy/0.log"
Apr 22 19:52:19.556223 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:19.556200 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gntnd_3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b/exporter/0.log"
Apr 22 19:52:19.585403 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:19.585381 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gntnd_3e06d61a-9ef7-4d70-b8ad-26b1ca39ab7b/extractor/0.log"
Apr 22 19:52:21.851552 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:21.851517 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-pzqhj_cd482c2d-ea55-4741-bb2e-ade54e49f678/manager/0.log"
Apr 22 19:52:21.875331 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:21.875306 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-pfj6v_0ae9a00d-a015-437f-8f4b-bef203a3c1e7/manager/0.log"
Apr 22 19:52:23.192414 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:23.192347 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-w285n"
Apr 22 19:52:27.059669 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:27.059632 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-bd5pc_fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3/kube-storage-version-migrator-operator/1.log"
Apr 22 19:52:27.060531 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:27.060512 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-bd5pc_fe75fb92-d6fe-47f7-96e0-d03ff3a8acc3/kube-storage-version-migrator-operator/0.log"
Apr 22 19:52:28.021326 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.021217 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-47h8f_06ac6726-1f7b-4981-a05c-4538095c85b7/kube-multus/0.log"
Apr 22 19:52:28.055505 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.055481 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:52:28.080415 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.080394 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/egress-router-binary-copy/0.log"
Apr 22 19:52:28.104970 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.104947 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/cni-plugins/0.log"
Apr 22 19:52:28.129474 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.129454 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/bond-cni-plugin/0.log"
Apr 22 19:52:28.153077 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.153060 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/routeoverride-cni/0.log"
Apr 22 19:52:28.178607 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.178587 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/whereabouts-cni-bincopy/0.log"
Apr 22 19:52:28.204808 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.204791 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c74jc_94116840-a7e4-4953-83a7-56e00b343c31/whereabouts-cni/0.log"
Apr 22 19:52:28.674987 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.674958 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mf94f_383fe532-b742-451a-8a94-dc5c7fd3fce5/network-metrics-daemon/0.log"
Apr 22 19:52:28.700409 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:28.700389 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mf94f_383fe532-b742-451a-8a94-dc5c7fd3fce5/kube-rbac-proxy/0.log"
Apr 22 19:52:30.378919 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.378892 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-controller/0.log"
Apr 22 19:52:30.407227 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.407198 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/0.log"
Apr 22 19:52:30.424062 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.424040 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovn-acl-logging/1.log"
Apr 22 19:52:30.448064 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.448042 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/kube-rbac-proxy-node/0.log"
Apr 22 19:52:30.473298 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.473262 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:52:30.497990 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.497972 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/northd/0.log"
Apr 22 19:52:30.527858 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.527836 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/nbdb/0.log"
Apr 22 19:52:30.555434 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.555413 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/sbdb/0.log"
Apr 22 19:52:30.657794 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:30.657733 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tlrd2_4f7c20eb-284f-4276-b58c-ed5062c1325e/ovnkube-controller/0.log"
Apr 22 19:52:31.634128 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:31.634089 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-98fl9_e6379d65-65a4-43ce-90b3-22b4af6360dc/check-endpoints/0.log"
Apr 22 19:52:31.659225 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:31.659201 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-57n8t_3dff91e7-4af5-48c7-992c-03ed3b2b6c0b/network-check-target-container/0.log"
Apr 22 19:52:32.705727 ip-10-0-133-163 kubenswrapper[2570]: I0422 19:52:32.705688 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wcprf_587139e8-f488-4657-8806-34d257b2339c/iptables-alerter/0.log"