Apr 24 22:27:41.082070 ip-10-0-134-101 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 22:27:41.082082 ip-10-0-134-101 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 22:27:41.082092 ip-10-0-134-101 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 22:27:41.082392 ip-10-0-134-101 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 22:27:51.184290 ip-10-0-134-101 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 22:27:51.184307 ip-10-0-134-101 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot afcd7859d2ec4b96a38a29c085423ee3 --
Apr 24 22:30:12.586194 ip-10-0-134-101 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:30:13.013346 ip-10-0-134-101 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:13.013346 ip-10-0-134-101 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:30:13.013346 ip-10-0-134-101 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:13.013346 ip-10-0-134-101 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:30:13.013346 ip-10-0-134-101 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:13.015025 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.014939 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:30:13.020494 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020478 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:13.020494 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020494 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020500 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020504 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020508 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020511 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020514 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020516 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020520 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020522 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020525 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020528 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020530 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020533 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020535 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020538 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020541 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020543 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020554 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020556 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:13.020564 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020559 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020562 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020564 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020567 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020570 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020572 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020575 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020578 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020581 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020584 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020587 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020589 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020592 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020594 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020597 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020599 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020602 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020605 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020609 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:13.021025 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020612 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020614 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020617 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020619 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020622 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020625 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020627 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020630 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020633 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020635 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020638 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020640 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020643 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020646 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020649 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020652 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020654 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020657 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020660 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:13.021507 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020662 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020665 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020667 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020669 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020672 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020674 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020677 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020679 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020682 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020685 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020687 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020690 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020692 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020695 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020697 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020699 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020702 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020704 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020707 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020709 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:13.021983 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020713 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020717 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020719 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020722 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020725 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020727 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020731 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.020733 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021140 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021145 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021148 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021151 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021154 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021157 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021159 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021162 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021165 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021168 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021170 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021173 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:13.022455 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021176 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021179 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021183 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021185 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021188 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021191 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021193 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021196 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021198 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021201 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021204 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021206 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021209 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021212 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021214 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021216 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021219 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021221 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021225 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:13.022950 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021227 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021235 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021238 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021241 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021244 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021246 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021249 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021252 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021254 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021257 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021259 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021262 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021264 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021267 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021269 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021272 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021275 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021277 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021280 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021284 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:13.023415 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021286 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021289 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021291 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021293 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021296 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021299 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021302 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021304 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021307 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021309 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021312 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021314 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021317 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021323 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021326 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021329 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021332 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021335 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021338 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:13.023914 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021341 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021343 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021346 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021348 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021351 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021353 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021356 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021358 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021361 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021364 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021366 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021369 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021371 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021373 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021376 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.021378 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022597 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022606 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022614 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022619 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022623 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:30:13.024383 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022626 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022631 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022637 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022640 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022643 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022647 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022651 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022654 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022657 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022659 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022662 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022665 2577 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022668 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022670 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022675 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022677 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022680 2577 flags.go:64] FLAG: --config-dir=""
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022683 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022686 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022690 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022693 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022695 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022699 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022701 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:30:13.024901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022704 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022707 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022710 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022713 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022717 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022721 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022724 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022727 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022730 2577 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022733 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022737 2577 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022740 2577 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022757 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022760 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022764 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022768 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022770 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022774 2577 flags.go:64] FLAG:
--eviction-pressure-transition-period="5m0s" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022777 2577 flags.go:64] FLAG: --eviction-soft="" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022779 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022782 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022785 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022788 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022791 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022794 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:30:13.025472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022797 2577 flags.go:64] FLAG: --feature-gates="" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022801 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022804 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022807 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022810 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022813 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022816 2577 flags.go:64] FLAG: --help="false" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: 
I0424 22:30:13.022819 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-134-101.ec2.internal" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022822 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022825 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022828 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022831 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022835 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022838 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022841 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022844 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022847 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022850 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022853 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022855 2577 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022858 2577 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022861 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022865 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022867 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:30:13.026101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022870 2577 flags.go:64] FLAG: --lock-file="" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022873 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022876 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022879 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022884 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022887 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022890 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022892 2577 flags.go:64] FLAG: --logging-format="text" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022895 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022898 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022901 2577 flags.go:64] FLAG: --manifest-url="" Apr 24 22:30:13.026693 ip-10-0-134-101 
kubenswrapper[2577]: I0424 22:30:13.022904 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022909 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022911 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022915 2577 flags.go:64] FLAG: --max-pods="110" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022918 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022921 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022924 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022927 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022930 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022933 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022936 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022943 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022946 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022949 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:30:13.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022953 
2577 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022956 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022962 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022965 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022968 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022971 2577 flags.go:64] FLAG: --port="10250" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022974 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022977 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b150f155904d61be" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022980 2577 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022983 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022986 2577 flags.go:64] FLAG: --register-node="true" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022989 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022992 2577 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022995 2577 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.022998 2577 flags.go:64] FLAG: --registry-qps="5" 
Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023001 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023003 2577 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023007 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023010 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023014 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023017 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023020 2577 flags.go:64] FLAG: --runonce="false" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023022 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023025 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023028 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:30:13.027306 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023031 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023034 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023036 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023039 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:30:13.023042 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023045 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023048 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023050 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023053 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023056 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023059 2577 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023064 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023070 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023073 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023076 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023080 2577 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023083 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023086 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023089 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 
22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023092 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023094 2577 flags.go:64] FLAG: --v="2" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023099 2577 flags.go:64] FLAG: --version="false" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023102 2577 flags.go:64] FLAG: --vmodule="" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023107 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:30:13.027965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.023110 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023220 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023224 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023228 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023232 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023234 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023238 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023241 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:30:13.028534 ip-10-0-134-101 
kubenswrapper[2577]: W0424 22:30:13.023243 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023247 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023251 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023254 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023257 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023260 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023262 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023265 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023267 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023270 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023272 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:30:13.028534 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023277 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023279 2577 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023282 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023285 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023288 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023292 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023295 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023298 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023301 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023304 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023306 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023309 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023312 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023314 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig 
Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023318 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023320 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023323 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023325 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023328 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:30:13.029044 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023330 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023333 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023335 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023338 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023340 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023343 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023345 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: 
W0424 22:30:13.023348 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023350 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023353 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023355 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023358 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023360 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023364 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023366 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023369 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023372 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023374 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023377 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023380 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 
22:30:13.029580 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023383 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023385 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023388 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023390 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023393 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023395 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023398 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023401 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023404 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023406 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023409 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023411 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023414 2577 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImages
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023416 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023419 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023422 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023425 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023428 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023430 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023433 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:13.030204 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023435 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023437 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023440 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023443 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023445 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023449 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023451 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023454 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.023456 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:13.030936 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.024332 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:30:13.032169 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.032146 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 22:30:13.032169 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.032169 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032238 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032246 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032251 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032257 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032261 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032265 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032270 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032274 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032279 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032283 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032296 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032300 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032304 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032308 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032313 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032318 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032322 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032327 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032331 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:13.032333 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032335 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032339 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032343 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032347 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032351 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032357 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032362 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032367 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032372 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032376 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032380 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032385 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032389 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032393 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032396 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032400 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032404 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032408 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032412 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:13.033223 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032419 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032425 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032430 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032435 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032439 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032442 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032447 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032451 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032456 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032460 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032465 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032469 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032498 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032506 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032510 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032516 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032520 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032525 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032530 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032535 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:13.033891 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032539 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032543 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032547 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032552 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032557 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032561 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032565 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032569 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032573 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032577 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032582 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032586 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032590 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032594 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032598 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032602 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032606 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032610 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032616 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:13.034442 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032621 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032625 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032629 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032633 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032637 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032642 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032646 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032650 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032655 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.032663 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032860 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032870 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032875 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032880 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032885 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:13.035121 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032890 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032895 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032900 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032904 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032909 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032913 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032917 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032921 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032925 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032929 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032934 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032938 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032942 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032946 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032950 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032954 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032958 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032964 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032968 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032973 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032977 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:13.035799 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032981 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032988 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032994 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.032999 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033004 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033009 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033014 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033018 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033023 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033027 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033032 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033037 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033041 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033046 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033050 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033054 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033058 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033062 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033066 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:13.036577 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033070 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033075 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033079 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033083 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033087 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033090 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033095 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033099 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033102 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033107 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033113 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033118 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033122 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033126 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033130 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033134 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033138 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033142 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033147 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:13.037188 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033151 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033155 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033160 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033164 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033168 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033172 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033176 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033180 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033183 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033188 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033192 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033198 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033204 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033208 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033212 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033216 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033220 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033224 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033227 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033231 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:13.037650 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033235 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:13.033239 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.033247 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.034105 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.036945 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.037974 2577 server.go:1019] "Starting client certificate rotation"
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.038075 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:30:13.038315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.038109 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:30:13.061592 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.061564 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:30:13.066171 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.066151 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:30:13.080411 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.080388 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:30:13.089255 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.089238 2577 log.go:25] "Validated CRI v1 image API"
Apr 24 22:30:13.090498 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.090485 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:30:13.097464 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.097442 2577 fs.go:135] Filesystem UUIDs: map[0a1332cc-22ff-427f-b65e-2004cb4d1764:/dev/nvme0n1p3 69819b05-53a7-4774-aef3-5026b184da53:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 22:30:13.097528 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.097465 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:30:13.102919 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.102815 2577 manager.go:217] Machine: {Timestamp:2026-04-24 22:30:13.101570915 +0000 UTC m=+0.396917150 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098152 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ba25588c2cdc879e501021b05b54b SystemUUID:ec2ba255-88c2-cdc8-79e5-01021b05b54b BootID:afcd7859-d2ec-4b96-a38a-29c085423ee3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:47:e5:b7:43:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:47:e5:b7:43:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:d2:66:5c:b9:ea Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:30:13.102919 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.102915 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:30:13.103031 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.103019 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:30:13.104143 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.104122 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:30:13.104274 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.104145 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-101.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPoli
cyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 22:30:13.104330 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.104283 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 22:30:13.104330 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.104301 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 22:30:13.104330 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.104314 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:30:13.104444 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.104329 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:30:13.105666 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.105653 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:30:13.105801 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.105792 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 22:30:13.108021 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.108009 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 24 22:30:13.108021 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.108024 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 22:30:13.108620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.108611 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 22:30:13.108655 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.108633 2577 kubelet.go:397] "Adding apiserver pod source" Apr 24 22:30:13.108655 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.108643 2577 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 22:30:13.109651 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.109630 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:30:13.109802 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.109778 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:30:13.109892 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.109807 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:30:13.112977 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.112961 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:30:13.114965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.114941 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:30:13.116550 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116533 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:30:13.116602 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116563 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:30:13.116602 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116579 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:30:13.116602 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116591 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:30:13.116684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116603 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Apr 24 22:30:13.116684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116614 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:30:13.116684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116625 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 22:30:13.116684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116637 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:30:13.116684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116651 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:30:13.116684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116664 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:30:13.116846 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116690 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:30:13.116846 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.116707 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:30:13.117503 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.117493 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:30:13.117535 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.117505 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:30:13.121234 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.121220 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:30:13.121301 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.121260 2577 server.go:1295] "Started kubelet" Apr 24 22:30:13.121382 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.121353 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:30:13.121972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.121929 2577 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:30:13.122026 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.121992 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:30:13.122090 ip-10-0-134-101 systemd[1]: Started Kubernetes Kubelet. Apr 24 22:30:13.123344 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.123318 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 22:30:13.123816 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.123803 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 24 22:30:13.132503 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.132484 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 22:30:13.133099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.133042 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 22:30:13.133255 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.133186 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 22:30:13.133844 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.133810 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found" Apr 24 22:30:13.133930 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.133900 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 22:30:13.133983 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.133958 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-101.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 22:30:13.134032 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.133991 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 22:30:13.134032 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134003 2577 factory.go:55] Registering systemd factory Apr 24 22:30:13.134032 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134015 2577 factory.go:223] Registration of the systemd container factory successfully Apr 24 22:30:13.134334 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.134042 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:30:13.134334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134071 2577 volume_manager.go:295] "The desired_state_of_world populator starts" 
Apr 24 22:30:13.134334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134082 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 22:30:13.134334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134230 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 22:30:13.134334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134241 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 22:30:13.134334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134325 2577 factory.go:153] Registering CRI-O factory
Apr 24 22:30:13.134597 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134341 2577 factory.go:223] Registration of the crio container factory successfully
Apr 24 22:30:13.134597 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134372 2577 factory.go:103] Registering Raw factory
Apr 24 22:30:13.134597 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134422 2577 manager.go:1196] Started watching for new ooms in manager
Apr 24 22:30:13.134597 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.134485 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-101.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 22:30:13.134812 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.134779 2577 manager.go:319] Starting recovery of all containers
Apr 24 22:30:13.135081 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.133939 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-101.ec2.internal.18a96b9468292cae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-101.ec2.internal,UID:ip-10-0-134-101.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-101.ec2.internal,},FirstTimestamp:2026-04-24 22:30:13.121232046 +0000 UTC m=+0.416578277,LastTimestamp:2026-04-24 22:30:13.121232046 +0000 UTC m=+0.416578277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-101.ec2.internal,}"
Apr 24 22:30:13.144103 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.144078 2577 manager.go:324] Recovery completed
Apr 24 22:30:13.148418 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.148403 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:30:13.151132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.151115 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:30:13.151215 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.151147 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:30:13.151215 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.151162 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:30:13.151633 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.151618 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 22:30:13.151633 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.151628 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 22:30:13.151721 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.151644 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:30:13.153984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.153972 2577 policy_none.go:49] "None policy: Start"
Apr 24 22:30:13.154037 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.153990 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 22:30:13.154037 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.154007 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 22:30:13.156914 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.156842 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-101.ec2.internal.18a96b9469f162c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-101.ec2.internal,UID:ip-10-0-134-101.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-101.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-101.ec2.internal,},FirstTimestamp:2026-04-24 22:30:13.151130306 +0000 UTC m=+0.446476541,LastTimestamp:2026-04-24 22:30:13.151130306 +0000 UTC m=+0.446476541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-101.ec2.internal,}"
Apr 24 22:30:13.156998 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.156956 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-101.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 22:30:13.156998 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.156955 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 22:30:13.167148 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.167083 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-101.ec2.internal.18a96b9469f1ba70 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-101.ec2.internal,UID:ip-10-0-134-101.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-101.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-101.ec2.internal,},FirstTimestamp:2026-04-24 22:30:13.151152752 +0000 UTC m=+0.446498985,LastTimestamp:2026-04-24 22:30:13.151152752 +0000 UTC m=+0.446498985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-101.ec2.internal,}"
Apr 24 22:30:13.170302 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.170246 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-101.ec2.internal.18a96b9469f1f16a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-101.ec2.internal,UID:ip-10-0-134-101.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-101.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-101.ec2.internal,},FirstTimestamp:2026-04-24 22:30:13.151166826 +0000 UTC m=+0.446513057,LastTimestamp:2026-04-24 22:30:13.151166826 +0000 UTC m=+0.446513057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-101.ec2.internal,}"
Apr 24 22:30:13.191345 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.191329 2577 manager.go:341] "Starting Device Plugin manager"
Apr 24 22:30:13.191441 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.191366 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 22:30:13.191441 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.191379 2577 server.go:85] "Starting device plugin registration server"
Apr 24 22:30:13.191629 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.191614 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 22:30:13.191727 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.191639 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 22:30:13.191796 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.191776 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 22:30:13.194137 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.192077 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 22:30:13.194137 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.192095 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 22:30:13.194137 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.193076 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 22:30:13.194137 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.193136 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.197254 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.197192 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-101.ec2.internal.18a96b946c8ca55c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-101.ec2.internal,UID:ip-10-0-134-101.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-134-101.ec2.internal,},FirstTimestamp:2026-04-24 22:30:13.194859868 +0000 UTC m=+0.490206093,LastTimestamp:2026-04-24 22:30:13.194859868 +0000 UTC m=+0.490206093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-101.ec2.internal,}"
Apr 24 22:30:13.219737 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.219716 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jpsx9"
Apr 24 22:30:13.228164 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.228146 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jpsx9"
Apr 24 22:30:13.257591 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.257562 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 22:30:13.258963 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.258939 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 22:30:13.258963 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.258963 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 22:30:13.259100 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.258980 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 22:30:13.259100 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.258987 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 22:30:13.259100 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.259054 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 22:30:13.269385 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.269338 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:30:13.292311 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.292280 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:30:13.293110 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.293094 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:30:13.293215 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.293124 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:30:13.293215 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.293134 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:30:13.293215 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.293156 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.300973 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.300958 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.301014 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.300981 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-101.ec2.internal\": node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.318670 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.318649 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.359666 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.359641 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal"]
Apr 24 22:30:13.359780 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.359709 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:30:13.360525 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.360509 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:30:13.360596 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.360537 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:30:13.360596 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.360548 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:30:13.361980 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.361968 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:30:13.362113 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362098 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.362163 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362126 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:30:13.362645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362626 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:30:13.362741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362634 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:30:13.362741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362674 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:30:13.362741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362684 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:30:13.362741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362654 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:30:13.362741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.362722 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:30:13.364132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.364117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.364180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.364151 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:30:13.364764 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.364731 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:30:13.364815 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.364775 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:30:13.364815 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.364787 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:30:13.395282 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.395262 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-101.ec2.internal\" not found" node="ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.399464 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.399449 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-101.ec2.internal\" not found" node="ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.419238 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.419211 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.436673 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.436655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/392735cd9fd3420f91a267e1b155d1d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal\" (UID: \"392735cd9fd3420f91a267e1b155d1d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.436769 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.436681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/392735cd9fd3420f91a267e1b155d1d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal\" (UID: \"392735cd9fd3420f91a267e1b155d1d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.436769 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.436696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1067ddb593414b000288d5e34ffef8f7-config\") pod \"kube-apiserver-proxy-ip-10-0-134-101.ec2.internal\" (UID: \"1067ddb593414b000288d5e34ffef8f7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.519699 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.519615 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.537428 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.537405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1067ddb593414b000288d5e34ffef8f7-config\") pod \"kube-apiserver-proxy-ip-10-0-134-101.ec2.internal\" (UID: \"1067ddb593414b000288d5e34ffef8f7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.537501 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.537434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/392735cd9fd3420f91a267e1b155d1d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal\" (UID: \"392735cd9fd3420f91a267e1b155d1d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.537501 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.537452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/392735cd9fd3420f91a267e1b155d1d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal\" (UID: \"392735cd9fd3420f91a267e1b155d1d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.537569 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.537501 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/392735cd9fd3420f91a267e1b155d1d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal\" (UID: \"392735cd9fd3420f91a267e1b155d1d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.537569 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.537498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1067ddb593414b000288d5e34ffef8f7-config\") pod \"kube-apiserver-proxy-ip-10-0-134-101.ec2.internal\" (UID: \"1067ddb593414b000288d5e34ffef8f7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.537569 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.537521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/392735cd9fd3420f91a267e1b155d1d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal\" (UID: \"392735cd9fd3420f91a267e1b155d1d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.620641 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.620616 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.697760 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.697729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.701687 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:13.701558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal"
Apr 24 22:30:13.721412 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.721393 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.822091 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.821997 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:13.922494 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:13.922465 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:14.023057 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:14.023027 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found"
Apr 24 22:30:14.038333 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.038311 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:30:14.038461 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.038446 2577 reflector.go:556] "Warning: watch ended with error"
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 22:30:14.060327 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.060306 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:14.123617 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:14.123574 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found" Apr 24 22:30:14.132801 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.132772 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 22:30:14.155242 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.155210 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:30:14.190689 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.190659 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-c5bmf" Apr 24 22:30:14.199967 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.199944 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-c5bmf" Apr 24 22:30:14.202696 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:14.202657 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1067ddb593414b000288d5e34ffef8f7.slice/crio-3c254a5ca9441a80ad24dbe63e240f9888db64bc35ac95b60df72c49abda8a23 WatchSource:0}: Error finding container 3c254a5ca9441a80ad24dbe63e240f9888db64bc35ac95b60df72c49abda8a23: Status 404 returned error can't find the container with id 
3c254a5ca9441a80ad24dbe63e240f9888db64bc35ac95b60df72c49abda8a23 Apr 24 22:30:14.203218 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:14.203190 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392735cd9fd3420f91a267e1b155d1d3.slice/crio-72d77175f7ebd49b546becbc5ec4ab4167388181cdbf3584a50702f3589d52c9 WatchSource:0}: Error finding container 72d77175f7ebd49b546becbc5ec4ab4167388181cdbf3584a50702f3589d52c9: Status 404 returned error can't find the container with id 72d77175f7ebd49b546becbc5ec4ab4167388181cdbf3584a50702f3589d52c9 Apr 24 22:30:14.208410 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.208394 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:30:14.223940 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:14.223920 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-101.ec2.internal\" not found" Apr 24 22:30:14.230259 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.230235 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:25:13 +0000 UTC" deadline="2028-01-28 14:01:01.131714322 +0000 UTC" Apr 24 22:30:14.230259 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.230259 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15447h30m46.901458331s" Apr 24 22:30:14.230561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.230549 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:14.233916 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.233904 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal" Apr 24 
22:30:14.261000 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.260976 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:30:14.261908 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.261865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal" event={"ID":"392735cd9fd3420f91a267e1b155d1d3","Type":"ContainerStarted","Data":"72d77175f7ebd49b546becbc5ec4ab4167388181cdbf3584a50702f3589d52c9"} Apr 24 22:30:14.262035 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.262018 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal" Apr 24 22:30:14.262763 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.262727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal" event={"ID":"1067ddb593414b000288d5e34ffef8f7","Type":"ContainerStarted","Data":"3c254a5ca9441a80ad24dbe63e240f9888db64bc35ac95b60df72c49abda8a23"} Apr 24 22:30:14.281446 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.281427 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:30:14.470602 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:14.470573 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:15.110961 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.110929 2577 apiserver.go:52] "Watching apiserver" Apr 24 22:30:15.119093 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.119062 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 22:30:15.119466 ip-10-0-134-101 
kubenswrapper[2577]: I0424 22:30:15.119445 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-g2bql","kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm","openshift-dns/node-resolver-4v6p7","openshift-image-registry/node-ca-rp5x6","openshift-multus/multus-kbktf","openshift-network-diagnostics/network-check-target-2jhbl","openshift-ovn-kubernetes/ovnkube-node-bbfmd","kube-system/konnectivity-agent-gm79c","openshift-cluster-node-tuning-operator/tuned-lk7ql","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal","openshift-multus/multus-additional-cni-plugins-l8lg2","openshift-multus/network-metrics-daemon-l8hdc","openshift-network-operator/iptables-alerter-j48ng"] Apr 24 22:30:15.122241 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.122218 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:15.122347 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.122331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:15.122347 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.122330 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:15.122454 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.122397 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:15.123548 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.123524 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.125249 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.125226 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4v6p7" Apr 24 22:30:15.126858 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.126826 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:30:15.127007 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.126988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-js2l7\"" Apr 24 22:30:15.127219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.127179 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rp5x6" Apr 24 22:30:15.127600 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.127582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:30:15.128054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.128041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:30:15.129291 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.129263 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:30:15.130288 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.130268 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gm79c" Apr 24 22:30:15.132127 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.132104 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.132669 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.132651 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:30:15.133224 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.132984 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:30:15.133224 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.133069 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-f8tbk\"" Apr 24 22:30:15.133224 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.133156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ppgwm\"" Apr 24 22:30:15.133414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.133291 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:30:15.133414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.133294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:30:15.133579 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.133563 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:30:15.134074 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.134056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j48ng" Apr 24 22:30:15.134617 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.134280 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.135461 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.135441 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.137024 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.137005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" Apr 24 22:30:15.138475 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.138458 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:15.138564 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.138516 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:15.140167 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.140147 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:30:15.141527 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.141508 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:30:15.142939 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.142921 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:30:15.142939 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.142934 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:30:15.143072 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.142945 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:30:15.143186 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143171 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:30:15.143228 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143186 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:15.143902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-cni-bin\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " 
pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.143902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-cni-multus\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.143902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0d65ad29-817e-44aa-888e-73e4eb91b70d-konnectivity-ca\") pod \"konnectivity-agent-gm79c\" (UID: \"0d65ad29-817e-44aa-888e-73e4eb91b70d\") " pod="kube-system/konnectivity-agent-gm79c" Apr 24 22:30:15.143902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-env-overrides\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.143902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92fe7641-e2dc-499a-a25d-09cdcdac368b-hosts-file\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7" Apr 24 22:30:15.143902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-etc-kubernetes\") 
pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-kubelet\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-run-ovn-kubernetes\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfm27\" (UniqueName: \"kubernetes.io/projected/0466857c-9575-492e-9148-290f37031549-kube-api-access-dfm27\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.143995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ea69dcf-def3-4c54-b774-dad54e40bced-serviceca\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/0d65ad29-817e-44aa-888e-73e4eb91b70d-agent-certs\") pod \"konnectivity-agent-gm79c\" (UID: \"0d65ad29-817e-44aa-888e-73e4eb91b70d\") " pod="kube-system/konnectivity-agent-gm79c" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-socket-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-cnibin\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslhz\" (UniqueName: \"kubernetes.io/projected/429121c4-84dd-4766-8b1d-bdc0550cffd5-kube-api-access-xslhz\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a00368dd-8532-419e-919b-66cb1cf3e0c9-iptables-alerter-script\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:30:15.144151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-etc-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-log-socket\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.144180 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-etc-selinux\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-cni-dir\") pod \"multus-kbktf\" (UID: 
\"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-socket-dir-parent\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-conf-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-daemon-config\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-cni-netd\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhllr\" (UniqueName: 
\"kubernetes.io/projected/a41baed3-bda8-4fa8-a15e-3fdd5f955169-kube-api-access-dhllr\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-device-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ea69dcf-def3-4c54-b774-dad54e40bced-host\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.144646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-os-release\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovnkube-config\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-system-cni-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-run-netns\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-registration-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-systemd\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfw5g\" (UniqueName: \"kubernetes.io/projected/92fe7641-e2dc-499a-a25d-09cdcdac368b-kube-api-access-tfw5g\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.144970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/429121c4-84dd-4766-8b1d-bdc0550cffd5-cni-binary-copy\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-ovn\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhl7\" (UniqueName: \"kubernetes.io/projected/9ea69dcf-def3-4c54-b774-dad54e40bced-kube-api-access-kdhl7\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovn-node-metrics-cert\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr55n\" (UniqueName: \"kubernetes.io/projected/a27aa799-5d01-4cf6-8ad8-43d20950172c-kube-api-access-tr55n\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92fe7641-e2dc-499a-a25d-09cdcdac368b-tmp-dir\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-systemd-units\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-node-log\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-sys-fs\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-netns\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-multus-certs\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlxhn\" (UniqueName: \"kubernetes.io/projected/a00368dd-8532-419e-919b-66cb1cf3e0c9-kube-api-access-qlxhn\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-cni-bin\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovnkube-script-lib\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-kubelet\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-hostroot\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a00368dd-8532-419e-919b-66cb1cf3e0c9-host-slash\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145451 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-slash\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-var-lib-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145527 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-k8s-cni-cncf-io\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145590 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145896 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 22:30:15.145942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145933 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-44l7b\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145975 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.145993 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zdncl\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146054 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146054 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146121 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146128 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146180 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146255 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dzwh4\""
Apr 24 22:30:15.146439 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146363 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 22:30:15.146791 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146517 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kfp54\""
Apr 24 22:30:15.146791 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146519 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n8x2b\""
Apr 24 22:30:15.146791 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146739 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-87hfw\""
Apr 24 22:30:15.146938 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.146802 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 22:30:15.201618 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.201588 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:25:14 +0000 UTC" deadline="2027-12-26 02:37:35.799888957 +0000 UTC"
Apr 24 22:30:15.201618 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.201615 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14644h7m20.59827674s"
Apr 24 22:30:15.235159 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.235126 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 22:30:15.246553 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246521 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-cnibin\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xslhz\" (UniqueName: \"kubernetes.io/projected/429121c4-84dd-4766-8b1d-bdc0550cffd5-kube-api-access-xslhz\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a00368dd-8532-419e-919b-66cb1cf3e0c9-iptables-alerter-script\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-etc-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-log-socket\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-etc-selinux\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-cnibin\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.246709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-cni-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-socket-dir-parent\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-cni-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-conf-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-daemon-config\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-cni-netd\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhllr\" (UniqueName: \"kubernetes.io/projected/a41baed3-bda8-4fa8-a15e-3fdd5f955169-kube-api-access-dhllr\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-device-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.246977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ea69dcf-def3-4c54-b774-dad54e40bced-host\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysconfig\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-lib-modules\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-os-release\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovnkube-config\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-system-cni-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-run-netns\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a00368dd-8532-419e-919b-66cb1cf3e0c9-iptables-alerter-script\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-var-lib-kubelet\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-system-cni-dir\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-cnibin\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247393 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-etc-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-log-socket\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-kubelet-config\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-registration-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.247619 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-systemd\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247533 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-etc-selinux\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysctl-d\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247587 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfw5g\" (UniqueName: \"kubernetes.io/projected/92fe7641-e2dc-499a-a25d-09cdcdac368b-kube-api-access-tfw5g\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/429121c4-84dd-4766-8b1d-bdc0550cffd5-cni-binary-copy\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-socket-dir-parent\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-ovn\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhl7\" (UniqueName: \"kubernetes.io/projected/9ea69dcf-def3-4c54-b774-dad54e40bced-kube-api-access-kdhl7\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-conf-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-systemd\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-ovn\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-sys\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-system-cni-dir\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-systemd\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.247994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-os-release\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/429121c4-84dd-4766-8b1d-bdc0550cffd5-multus-daemon-config\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.248438 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/429121c4-84dd-4766-8b1d-bdc0550cffd5-cni-binary-copy\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ea69dcf-def3-4c54-b774-dad54e40bced-host\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3b8675c-f9be-454b-aba6-002550b72a84-etc-tuned\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-cni-netd\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-run-netns\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddlhd\" (UniqueName: \"kubernetes.io/projected/a3b8675c-f9be-454b-aba6-002550b72a84-kube-api-access-ddlhd\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248382 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-device-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovn-node-metrics-cert\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248448 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-registration-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr55n\" (UniqueName: \"kubernetes.io/projected/a27aa799-5d01-4cf6-8ad8-43d20950172c-kube-api-access-tr55n\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovnkube-config\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 
kubenswrapper[2577]: I0424 22:30:15.248579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92fe7641-e2dc-499a-a25d-09cdcdac368b-tmp-dir\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-systemd-units\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-node-log\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-node-log\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248767 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-systemd-units\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248796 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-modprobe-d\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysctl-conf\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.249219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248819 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-sys-fs\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: 
I0424 22:30:15.248900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92fe7641-e2dc-499a-a25d-09cdcdac368b-tmp-dir\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-netns\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-multus-certs\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-multus-certs\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlxhn\" (UniqueName: \"kubernetes.io/projected/a00368dd-8532-419e-919b-66cb1cf3e0c9-kube-api-access-qlxhn\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248977 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-netns\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.248955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-sys-fs\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249004 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-cni-bin\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovnkube-script-lib\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-cni-bin\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249042 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3b8675c-f9be-454b-aba6-002550b72a84-tmp\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-dbus\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-kubelet\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-hostroot\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.249984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a00368dd-8532-419e-919b-66cb1cf3e0c9-host-slash\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249185 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-kubelet\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-slash\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a00368dd-8532-419e-919b-66cb1cf3e0c9-host-slash\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-hostroot\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-var-lib-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.249234 2577 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-slash\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-var-lib-openvswitch\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-kubernetes\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-run-openvswitch\") pod 
\"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.249361 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.749321652 +0000 UTC m=+3.044667869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-run\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-k8s-cni-cncf-io\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-cni-bin\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " 
pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-run-k8s-cni-cncf-io\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.250561 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-cni-multus\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0d65ad29-817e-44aa-888e-73e4eb91b70d-konnectivity-ca\") pod \"konnectivity-agent-gm79c\" (UID: \"0d65ad29-817e-44aa-888e-73e4eb91b70d\") " pod="kube-system/konnectivity-agent-gm79c" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-cni-bin\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovnkube-script-lib\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-host-var-lib-cni-multus\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvmwz\" (UniqueName: \"kubernetes.io/projected/4264f6dc-3224-4773-b9ba-8ad8e185093f-kube-api-access-rvmwz\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-env-overrides\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92fe7641-e2dc-499a-a25d-09cdcdac368b-hosts-file\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-etc-kubernetes\") pod \"multus-kbktf\" (UID: 
\"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-kubelet\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-run-ovn-kubernetes\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfm27\" (UniqueName: \"kubernetes.io/projected/0466857c-9575-492e-9148-290f37031549-kube-api-access-dfm27\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/429121c4-84dd-4766-8b1d-bdc0550cffd5-etc-kubernetes\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf" Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92fe7641-e2dc-499a-a25d-09cdcdac368b-hosts-file\") pod \"node-resolver-4v6p7\" (UID: 
\"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7"
Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ea69dcf-def3-4c54-b774-dad54e40bced-serviceca\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0d65ad29-817e-44aa-888e-73e4eb91b70d-agent-certs\") pod \"konnectivity-agent-gm79c\" (UID: \"0d65ad29-817e-44aa-888e-73e4eb91b70d\") " pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-run-ovn-kubernetes\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.251054 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-host\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-os-release\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a41baed3-bda8-4fa8-a15e-3fdd5f955169-host-kubelet\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-socket-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-cni-binary-copy\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.249974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0d65ad29-817e-44aa-888e-73e4eb91b70d-konnectivity-ca\") pod \"konnectivity-agent-gm79c\" (UID: \"0d65ad29-817e-44aa-888e-73e4eb91b70d\") " pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.250078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a27aa799-5d01-4cf6-8ad8-43d20950172c-socket-dir\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.250300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ea69dcf-def3-4c54-b774-dad54e40bced-serviceca\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.251494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.250733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a41baed3-bda8-4fa8-a15e-3fdd5f955169-env-overrides\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.252402 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.252374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a41baed3-bda8-4fa8-a15e-3fdd5f955169-ovn-node-metrics-cert\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.252508 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.252492 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0d65ad29-817e-44aa-888e-73e4eb91b70d-agent-certs\") pod \"konnectivity-agent-gm79c\" (UID: \"0d65ad29-817e-44aa-888e-73e4eb91b70d\") " pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:15.270665 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.270632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhllr\" (UniqueName: \"kubernetes.io/projected/a41baed3-bda8-4fa8-a15e-3fdd5f955169-kube-api-access-dhllr\") pod \"ovnkube-node-bbfmd\" (UID: \"a41baed3-bda8-4fa8-a15e-3fdd5f955169\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.270827 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.270694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfw5g\" (UniqueName: \"kubernetes.io/projected/92fe7641-e2dc-499a-a25d-09cdcdac368b-kube-api-access-tfw5g\") pod \"node-resolver-4v6p7\" (UID: \"92fe7641-e2dc-499a-a25d-09cdcdac368b\") " pod="openshift-dns/node-resolver-4v6p7"
Apr 24 22:30:15.271993 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.271969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslhz\" (UniqueName: \"kubernetes.io/projected/429121c4-84dd-4766-8b1d-bdc0550cffd5-kube-api-access-xslhz\") pod \"multus-kbktf\" (UID: \"429121c4-84dd-4766-8b1d-bdc0550cffd5\") " pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.272333 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.272312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhl7\" (UniqueName: \"kubernetes.io/projected/9ea69dcf-def3-4c54-b774-dad54e40bced-kube-api-access-kdhl7\") pod \"node-ca-rp5x6\" (UID: \"9ea69dcf-def3-4c54-b774-dad54e40bced\") " pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.273201 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.273173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr55n\" (UniqueName: \"kubernetes.io/projected/a27aa799-5d01-4cf6-8ad8-43d20950172c-kube-api-access-tr55n\") pod \"aws-ebs-csi-driver-node-9djsm\" (UID: \"a27aa799-5d01-4cf6-8ad8-43d20950172c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.273463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.273447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfm27\" (UniqueName: \"kubernetes.io/projected/0466857c-9575-492e-9148-290f37031549-kube-api-access-dfm27\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:15.275810 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.275792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlxhn\" (UniqueName: \"kubernetes.io/projected/a00368dd-8532-419e-919b-66cb1cf3e0c9-kube-api-access-qlxhn\") pod \"iptables-alerter-j48ng\" (UID: \"a00368dd-8532-419e-919b-66cb1cf3e0c9\") " pod="openshift-network-operator/iptables-alerter-j48ng"
Apr 24 22:30:15.278259 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.278240 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:15.278324 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.278267 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:15.278324 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.278281 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:15.278384 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.278374 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.778353458 +0000 UTC m=+3.073699695 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:15.350386 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.350386 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-var-lib-kubelet\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-system-cni-dir\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-cnibin\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-kubelet-config\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350519 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-var-lib-kubelet\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysctl-d\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-systemd\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.350566 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:15.350620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-sys\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.350643 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.85062304 +0000 UTC m=+3.145969259 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-sys\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-kubelet-config\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350683 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3b8675c-f9be-454b-aba6-002550b72a84-etc-tuned\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-system-cni-dir\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddlhd\" (UniqueName: \"kubernetes.io/projected/a3b8675c-f9be-454b-aba6-002550b72a84-kube-api-access-ddlhd\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-cnibin\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-modprobe-d\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysctl-conf\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-modprobe-d\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-systemd\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3b8675c-f9be-454b-aba6-002550b72a84-tmp\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysctl-d\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351092 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350895 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-dbus\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-kubernetes\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-run\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysctl-conf\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-dbus\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.350986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvmwz\" (UniqueName: \"kubernetes.io/projected/4264f6dc-3224-4773-b9ba-8ad8e185093f-kube-api-access-rvmwz\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351096 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-kubernetes\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-host\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-os-release\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-run\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-cni-binary-copy\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysconfig\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4264f6dc-3224-4773-b9ba-8ad8e185093f-os-release\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-lib-modules\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-lib-modules\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-etc-sysconfig\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.351924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351710 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-cni-binary-copy\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.352555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.351829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4264f6dc-3224-4773-b9ba-8ad8e185093f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.352555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.352087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3b8675c-f9be-454b-aba6-002550b72a84-host\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.353729 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.353705 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3b8675c-f9be-454b-aba6-002550b72a84-tmp\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.354018 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.353991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3b8675c-f9be-454b-aba6-002550b72a84-etc-tuned\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.374984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.374913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvmwz\" (UniqueName: \"kubernetes.io/projected/4264f6dc-3224-4773-b9ba-8ad8e185093f-kube-api-access-rvmwz\") pod \"multus-additional-cni-plugins-l8lg2\" (UID: \"4264f6dc-3224-4773-b9ba-8ad8e185093f\") " pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.375928 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.375907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddlhd\" (UniqueName: \"kubernetes.io/projected/a3b8675c-f9be-454b-aba6-002550b72a84-kube-api-access-ddlhd\") pod \"tuned-lk7ql\" (UID: \"a3b8675c-f9be-454b-aba6-002550b72a84\") " pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.437808 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.437778 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm"
Apr 24 22:30:15.446458 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.446430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4v6p7"
Apr 24 22:30:15.454799 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.454776 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rp5x6"
Apr 24 22:30:15.460415 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.460400 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:15.467565 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.467544 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kbktf"
Apr 24 22:30:15.474036 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.474017 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j48ng"
Apr 24 22:30:15.482570 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.482552 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:30:15.488250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.488103 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lk7ql"
Apr 24 22:30:15.493726 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.493708 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l8lg2"
Apr 24 22:30:15.753650 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.753552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:15.753822 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.753709 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:15.753822 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.753809 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:16.753787942 +0000 UTC m=+4.049134163 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:15.819386 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.819354 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4264f6dc_3224_4773_b9ba_8ad8e185093f.slice/crio-f7749da58a88050cf2292def14ae35f9b9e5b03e5a87271620d1f4c446e1076e WatchSource:0}: Error finding container f7749da58a88050cf2292def14ae35f9b9e5b03e5a87271620d1f4c446e1076e: Status 404 returned error can't find the container with id f7749da58a88050cf2292def14ae35f9b9e5b03e5a87271620d1f4c446e1076e
Apr 24 22:30:15.821365 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.821301 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod429121c4_84dd_4766_8b1d_bdc0550cffd5.slice/crio-49e3890aefd4e92b045553e0ddf4b93c9cfc4c97cdaa0ca2779395bda0221d97 WatchSource:0}: Error finding container 49e3890aefd4e92b045553e0ddf4b93c9cfc4c97cdaa0ca2779395bda0221d97: Status 404 returned error can't find the container with id 49e3890aefd4e92b045553e0ddf4b93c9cfc4c97cdaa0ca2779395bda0221d97
Apr 24 22:30:15.825361 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.825335 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00368dd_8532_419e_919b_66cb1cf3e0c9.slice/crio-41a994a1a1095a1121ef728f81928399de56ed57088b89ab81cd8bd7e3cd9e6d WatchSource:0}: Error finding container 41a994a1a1095a1121ef728f81928399de56ed57088b89ab81cd8bd7e3cd9e6d: Status 404 returned error can't find the container with id 41a994a1a1095a1121ef728f81928399de56ed57088b89ab81cd8bd7e3cd9e6d
Apr 24 22:30:15.825941 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.825915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d65ad29_817e_44aa_888e_73e4eb91b70d.slice/crio-f3f907cd50d2f59bda375cd8e944f8da34785ececb2d25133c7774038369ab73 WatchSource:0}: Error finding container f3f907cd50d2f59bda375cd8e944f8da34785ececb2d25133c7774038369ab73: Status 404 returned error can't find the container with id f3f907cd50d2f59bda375cd8e944f8da34785ececb2d25133c7774038369ab73
Apr 24 22:30:15.826583 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.826544 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda41baed3_bda8_4fa8_a15e_3fdd5f955169.slice/crio-0900c43894e9ec5b446aecdb8ecc424e69625d52da61c6ae27ebeb0aadf0a4c3 WatchSource:0}: Error finding container 0900c43894e9ec5b446aecdb8ecc424e69625d52da61c6ae27ebeb0aadf0a4c3: Status 404 returned error can't find the container with id 0900c43894e9ec5b446aecdb8ecc424e69625d52da61c6ae27ebeb0aadf0a4c3
Apr 24 22:30:15.827446 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.827425 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b8675c_f9be_454b_aba6_002550b72a84.slice/crio-ca6ee4a8d81e0d7b4b73d5a23ba79285bf0a78224dc18ed24e1b3a4f12818277 WatchSource:0}: Error finding container ca6ee4a8d81e0d7b4b73d5a23ba79285bf0a78224dc18ed24e1b3a4f12818277: Status 404 returned error can't find the container with id ca6ee4a8d81e0d7b4b73d5a23ba79285bf0a78224dc18ed24e1b3a4f12818277
Apr 24 22:30:15.828685 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.828652 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda27aa799_5d01_4cf6_8ad8_43d20950172c.slice/crio-7cbee6e41fda5a91ec56713be745ebd4edf7e52614770d184e10f4a9fc006005 WatchSource:0}: Error finding container 7cbee6e41fda5a91ec56713be745ebd4edf7e52614770d184e10f4a9fc006005: Status 404 returned error can't find the container with id 7cbee6e41fda5a91ec56713be745ebd4edf7e52614770d184e10f4a9fc006005
Apr 24 22:30:15.829304 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.829243 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92fe7641_e2dc_499a_a25d_09cdcdac368b.slice/crio-11dbfda3b10e2f82d891ac82507dc4e2931ee2ceace089a8b54624546214c778 WatchSource:0}: Error finding container 11dbfda3b10e2f82d891ac82507dc4e2931ee2ceace089a8b54624546214c778: Status 404 returned error can't find the container with id 11dbfda3b10e2f82d891ac82507dc4e2931ee2ceace089a8b54624546214c778
Apr 24 22:30:15.832235 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:30:15.831798 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea69dcf_def3_4c54_b774_dad54e40bced.slice/crio-b36319f6d72aa4b9b8737fe406ef3696447aba7779f36d38fc58dcca85e4447a WatchSource:0}: Error finding container b36319f6d72aa4b9b8737fe406ef3696447aba7779f36d38fc58dcca85e4447a: Status 404 returned error can't find the container with id b36319f6d72aa4b9b8737fe406ef3696447aba7779f36d38fc58dcca85e4447a
Apr 24 22:30:15.854864 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.854841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:15.854940 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:15.854872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName:
\"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:15.854983 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.854971 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:15.855016 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.854979 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:15.855016 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.854993 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:15.855016 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.855002 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:15.855102 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.855031 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:16.855016109 +0000 UTC m=+4.150362327 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:15.855102 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:15.855045 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:16.855039262 +0000 UTC m=+4.150385480 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:16.202319 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.202019 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:25:14 +0000 UTC" deadline="2028-01-30 00:57:55.688628358 +0000 UTC" Apr 24 22:30:16.202319 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.202268 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15482h27m39.486366756s" Apr 24 22:30:16.260529 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.260000 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:16.260529 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.260123 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:16.275682 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.275640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gm79c" event={"ID":"0d65ad29-817e-44aa-888e-73e4eb91b70d","Type":"ContainerStarted","Data":"f3f907cd50d2f59bda375cd8e944f8da34785ececb2d25133c7774038369ab73"} Apr 24 22:30:16.279759 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.279685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j48ng" event={"ID":"a00368dd-8532-419e-919b-66cb1cf3e0c9","Type":"ContainerStarted","Data":"41a994a1a1095a1121ef728f81928399de56ed57088b89ab81cd8bd7e3cd9e6d"} Apr 24 22:30:16.284808 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.284781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rp5x6" event={"ID":"9ea69dcf-def3-4c54-b774-dad54e40bced","Type":"ContainerStarted","Data":"b36319f6d72aa4b9b8737fe406ef3696447aba7779f36d38fc58dcca85e4447a"} Apr 24 22:30:16.286123 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.286099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4v6p7" event={"ID":"92fe7641-e2dc-499a-a25d-09cdcdac368b","Type":"ContainerStarted","Data":"11dbfda3b10e2f82d891ac82507dc4e2931ee2ceace089a8b54624546214c778"} Apr 24 22:30:16.287671 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.287646 
2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kbktf" event={"ID":"429121c4-84dd-4766-8b1d-bdc0550cffd5","Type":"ContainerStarted","Data":"49e3890aefd4e92b045553e0ddf4b93c9cfc4c97cdaa0ca2779395bda0221d97"} Apr 24 22:30:16.289278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.289242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerStarted","Data":"f7749da58a88050cf2292def14ae35f9b9e5b03e5a87271620d1f4c446e1076e"} Apr 24 22:30:16.292728 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.292105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal" event={"ID":"1067ddb593414b000288d5e34ffef8f7","Type":"ContainerStarted","Data":"2525c9f0dccc97de5f69ee4c0b3b1677ce2c00d54e6854e1548ec03ce90cb471"} Apr 24 22:30:16.297149 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.297101 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" event={"ID":"a27aa799-5d01-4cf6-8ad8-43d20950172c","Type":"ContainerStarted","Data":"7cbee6e41fda5a91ec56713be745ebd4edf7e52614770d184e10f4a9fc006005"} Apr 24 22:30:16.300126 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.300067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"0900c43894e9ec5b446aecdb8ecc424e69625d52da61c6ae27ebeb0aadf0a4c3"} Apr 24 22:30:16.302652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.302628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" event={"ID":"a3b8675c-f9be-454b-aba6-002550b72a84","Type":"ContainerStarted","Data":"ca6ee4a8d81e0d7b4b73d5a23ba79285bf0a78224dc18ed24e1b3a4f12818277"} Apr 24 
22:30:16.763520 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.762913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:16.763520 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.763102 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:16.763520 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.763169 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.763148844 +0000 UTC m=+6.058495065 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:16.864070 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.864016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:16.864070 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:16.864076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:16.864303 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.864263 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:16.864303 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.864285 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:16.864414 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.864312 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:16.864414 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.864402 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.864381224 +0000 UTC m=+6.159727445 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:16.864985 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.864962 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:16.865090 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:16.865019 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.865004401 +0000 UTC m=+6.160350626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:17.261875 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:17.261840 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:17.262457 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:17.261971 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:17.262457 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:17.262070 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:17.262457 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:17.262150 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:17.332580 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:17.332483 2577 generic.go:358] "Generic (PLEG): container finished" podID="392735cd9fd3420f91a267e1b155d1d3" containerID="2c6803635ae6a74d6ba9e896d8bb53a2b5711cd181b58a94815f0482c61455cc" exitCode=0 Apr 24 22:30:17.333579 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:17.333415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal" event={"ID":"392735cd9fd3420f91a267e1b155d1d3","Type":"ContainerDied","Data":"2c6803635ae6a74d6ba9e896d8bb53a2b5711cd181b58a94815f0482c61455cc"} Apr 24 22:30:17.358268 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:17.358212 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-101.ec2.internal" podStartSLOduration=3.3581882849999998 podStartE2EDuration="3.358188285s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:16.317694273 +0000 UTC m=+3.613040512" watchObservedRunningTime="2026-04-24 22:30:17.358188285 +0000 UTC m=+4.653534526" Apr 24 22:30:18.260539 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:18.260039 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:18.260539 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.260166 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:18.337826 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:18.337120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal" event={"ID":"392735cd9fd3420f91a267e1b155d1d3","Type":"ContainerStarted","Data":"04dc5c5b60c4c17738f09a03a6681d6fa9d8f3790d8fa129ba7f700f42bdd019"} Apr 24 22:30:18.779600 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:18.779562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:18.779989 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.779788 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:18.779989 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.779883 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:22.779862878 +0000 UTC m=+10.075209100 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:18.880820 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:18.880781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:18.880981 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:18.880843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:18.881060 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.881007 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:18.881060 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.881025 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:18.881060 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.881037 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:18.881191 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.881098 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:22.881079115 +0000 UTC m=+10.176425350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:18.881562 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.881530 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:18.881680 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:18.881588 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:22.881574489 +0000 UTC m=+10.176920709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:30:19.259930 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:19.259857 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:19.260104 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:19.259990 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:19.260543 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:19.260518 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:19.260686 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:19.260657 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:20.260074 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:20.260014 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:20.260539 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:20.260141 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:21.259906 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:21.259874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:21.259906 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:21.259900 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:21.260514 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:21.260022 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:21.260514 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:21.260155 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:22.259202 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:22.259170 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:22.259374 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.259302 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:22.816595 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:22.816548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:22.817160 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.816781 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:22.817160 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.816842 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.816825014 +0000 UTC m=+18.112171232 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:22.917531 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:22.917490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:22.917531 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:22.917544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:22.917733 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.917700 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:22.917733 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.917717 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:22.917733 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.917727 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:22.917920 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.917791 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.917773781 +0000 UTC m=+18.213119998 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:22.918125 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.918097 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:22.918275 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:22.918169 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:30.918149644 +0000 UTC m=+18.213495873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:23.260947 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:23.260858 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:23.260947 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:23.260904 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:23.261178 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:23.260980 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:23.261178 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:23.261065 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:24.259472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:24.259438 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:24.260093 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:24.259580 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:25.259490 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:25.259443 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:25.259943 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:25.259455 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:25.259943 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:25.259566 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:25.259943 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:25.259666 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:26.259792 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:26.259735 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:26.260212 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:26.259869 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:27.259448 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:27.259407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:27.259618 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:27.259544 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:27.259618 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:27.259606 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:27.259726 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:27.259695 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:28.259902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:28.259874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:28.260408 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:28.259978 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:29.260105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:29.260070 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:29.260863 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:29.260095 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:29.260863 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:29.260217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:29.260863 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:29.260400 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:30.259384 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:30.259350 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:30.259654 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.259482 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:30.878513 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:30.878474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:30.878958 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.878654 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:30.878958 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.878732 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:46.878708733 +0000 UTC m=+34.174054963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:30.979309 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:30.979272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:30.979309 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:30.979320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:30.979562 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.979434 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:30.979843 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.979702 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:30.979843 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.979729 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:30.982053 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.979762 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:30.982328 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.979917 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:46.979891496 +0000 UTC m=+34.275237735 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:30.983117 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:30.983093 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:46.982328185 +0000 UTC m=+34.277674413 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:31.260005 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:31.259920 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:31.260164 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:31.260054 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:31.260422 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:31.260393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:31.260502 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:31.260483 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:32.259252 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:32.259200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:32.259730 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:32.259326 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:33.261126 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:33.261101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:33.261425 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:33.261217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:33.261425 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:33.261302 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:33.261425 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:33.261407 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:34.260294 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.259977 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:34.260451 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:34.260320 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:34.367987 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.367948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rp5x6" event={"ID":"9ea69dcf-def3-4c54-b774-dad54e40bced","Type":"ContainerStarted","Data":"35c481ab51ff433c73824c5571685d74c379cae4afa6cf6b5a28574ccf42ad0b"}
Apr 24 22:30:34.369815 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.369788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4v6p7" event={"ID":"92fe7641-e2dc-499a-a25d-09cdcdac368b","Type":"ContainerStarted","Data":"8ff4ee790d240fe2ebc36907f4370fd36745094163e62d2014c7aa6b606b105b"}
Apr 24 22:30:34.372453 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.372429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kbktf" event={"ID":"429121c4-84dd-4766-8b1d-bdc0550cffd5","Type":"ContainerStarted","Data":"caee41f610df7bf1c984e0986b0e2694b12d5261cbb863a17408f7a494a2281b"}
Apr 24 22:30:34.374044 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.374019 2577 generic.go:358] "Generic (PLEG): container finished" podID="4264f6dc-3224-4773-b9ba-8ad8e185093f" containerID="a64bccf004bc559661fa47be33ff94ad10b06903b171137a7b4a54ed449b18ba" exitCode=0
Apr 24 22:30:34.374317 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.374285 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerDied","Data":"a64bccf004bc559661fa47be33ff94ad10b06903b171137a7b4a54ed449b18ba"}
Apr 24 22:30:34.375632 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.375602 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" event={"ID":"a27aa799-5d01-4cf6-8ad8-43d20950172c","Type":"ContainerStarted","Data":"6cabd802121f655023b484d11dd11a9cab79913063c7ce221814e4b615aeea9d"}
Apr 24 22:30:34.378282 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:30:34.378575 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378553 2577 generic.go:358] "Generic (PLEG): container finished" podID="a41baed3-bda8-4fa8-a15e-3fdd5f955169" containerID="ff05d4983c5878f97ca4d5a59a7a0af4f0e5cd23cc526787d26f4cfc035ec035" exitCode=1
Apr 24 22:30:34.378659 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"4f26276da6c01ecd889ceb69479e6203432a04d064f4c800822f0d9e4b775c25"}
Apr 24 22:30:34.378704 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"980f06668f306e9459de2687ae4258a07eceed1ac6392cf11858dbe4cf6d7e86"}
Apr 24 22:30:34.378704 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"bd85a7691cc9df699b3593f8ca557c49f8033c0b82fe10b568b1e07743b52f93"}
Apr 24 22:30:34.378704 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"fc6756d004473ff2d3a491e4c79b4b1503d8b1e661e9db06a468199355ed33fd"}
Apr 24 22:30:34.378704 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerDied","Data":"ff05d4983c5878f97ca4d5a59a7a0af4f0e5cd23cc526787d26f4cfc035ec035"}
Apr 24 22:30:34.378905 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.378711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"5ba63412ae9714d87626b390ff8b5dfc74b71cf065688ce66f51e0f202800080"}
Apr 24 22:30:34.380342 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.380012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" event={"ID":"a3b8675c-f9be-454b-aba6-002550b72a84","Type":"ContainerStarted","Data":"67c0d810eff527dacd714b22526a67d81736ba7b751c4490d2c9b5b6397c0172"}
Apr 24 22:30:34.381697 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.381363 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gm79c" event={"ID":"0d65ad29-817e-44aa-888e-73e4eb91b70d","Type":"ContainerStarted","Data":"15b6d45fc5aa52774d5ec6f127a4782e525026ac1bb8e0825f40d067f0cb50a4"}
Apr 24 22:30:34.385394 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.385361 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-101.ec2.internal" podStartSLOduration=20.385351317 podStartE2EDuration="20.385351317s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:18.376793454 +0000 UTC m=+5.672139697" watchObservedRunningTime="2026-04-24 22:30:34.385351317 +0000 UTC m=+21.680697573"
Apr 24 22:30:34.412278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.412236 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rp5x6" podStartSLOduration=4.084693744 podStartE2EDuration="21.412222924s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.832846792 +0000 UTC m=+3.128193011" lastFinishedPulling="2026-04-24 22:30:33.160375972 +0000 UTC m=+20.455722191" observedRunningTime="2026-04-24 22:30:34.385471416 +0000 UTC m=+21.680817650" watchObservedRunningTime="2026-04-24 22:30:34.412222924 +0000 UTC m=+21.707569164"
Apr 24 22:30:34.437116 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.437077 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lk7ql" podStartSLOduration=3.992887479 podStartE2EDuration="21.43705541s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.829664397 +0000 UTC m=+3.125010616" lastFinishedPulling="2026-04-24 22:30:33.273832324 +0000 UTC m=+20.569178547" observedRunningTime="2026-04-24 22:30:34.412420123 +0000 UTC m=+21.707766373" watchObservedRunningTime="2026-04-24 22:30:34.43705541 +0000 UTC m=+21.732401629"
Apr 24 22:30:34.457467 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.457429 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4v6p7" podStartSLOduration=4.015059835 podStartE2EDuration="21.457417614s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.832010027 +0000 UTC m=+3.127356245" lastFinishedPulling="2026-04-24 22:30:33.274367806 +0000 UTC m=+20.569714024" observedRunningTime="2026-04-24 22:30:34.457208331 +0000 UTC m=+21.752554581" watchObservedRunningTime="2026-04-24 22:30:34.457417614 +0000 UTC m=+21.752763854"
Apr 24 22:30:34.493182 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.491390 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gm79c" podStartSLOduration=4.158540034 podStartE2EDuration="21.491371547s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.827546449 +0000 UTC m=+3.122892668" lastFinishedPulling="2026-04-24 22:30:33.160377959 +0000 UTC m=+20.455724181" observedRunningTime="2026-04-24 22:30:34.473012955 +0000 UTC m=+21.768359194" watchObservedRunningTime="2026-04-24 22:30:34.491371547 +0000 UTC m=+21.786717789"
Apr 24 22:30:34.825610 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:34.825583 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 22:30:35.204376 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.204226 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:34.825604816Z","UUID":"02722357-db4b-481d-a2d0-012e0ea14976","Handler":null,"Name":"","Endpoint":""}
Apr 24 22:30:35.207822 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.207796 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 22:30:35.207974 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.207829 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 22:30:35.259595 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.259566 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:35.259741 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:35.259660 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:35.259870 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.259852 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:35.260000 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:35.259978 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:35.385517 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.385482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" event={"ID":"a27aa799-5d01-4cf6-8ad8-43d20950172c","Type":"ContainerStarted","Data":"9a98b43e212fabaadd41c466a287192a78219875f5474b3510cf766dc36d6077"}
Apr 24 22:30:35.386967 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.386921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j48ng" event={"ID":"a00368dd-8532-419e-919b-66cb1cf3e0c9","Type":"ContainerStarted","Data":"83d2bb4e5029d0e8086bfe5b8cf92499ff92477f5dd49c12fbc5d72b163067a1"}
Apr 24 22:30:35.403189 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.403153 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kbktf" podStartSLOduration=4.831049745 podStartE2EDuration="22.403139737s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.823309426 +0000 UTC m=+3.118655658" lastFinishedPulling="2026-04-24 22:30:33.39539943 +0000 UTC m=+20.690745650" observedRunningTime="2026-04-24 22:30:34.491009793 +0000 UTC m=+21.786356034" watchObservedRunningTime="2026-04-24 22:30:35.403139737 +0000 UTC m=+22.698485977"
Apr 24 22:30:35.403668 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:35.403637 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j48ng" podStartSLOduration=5.019580683 podStartE2EDuration="22.403624908s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.827079144 +0000 UTC m=+3.122425362" lastFinishedPulling="2026-04-24 22:30:33.211123354 +0000 UTC m=+20.506469587" observedRunningTime="2026-04-24 22:30:35.402936465 +0000 UTC m=+22.698282705" watchObservedRunningTime="2026-04-24 22:30:35.403624908 +0000 UTC m=+22.698971150"
Apr 24 22:30:36.259271 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:36.259200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:36.259415 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:36.259305 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:36.390257 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:36.390215 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" event={"ID":"a27aa799-5d01-4cf6-8ad8-43d20950172c","Type":"ContainerStarted","Data":"f71c1e443822e42d6a530818e97d1bc17fe7eff54acb9947cc0c906451b27e88"}
Apr 24 22:30:36.393334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:36.393314 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:30:36.393711 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:36.393687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"4c122bcc0d1bb0cabb70b0dfe93bd98a9c59b0c4032369c6f0f2035ca3d70376"}
Apr 24 22:30:36.422906 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:36.422842 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9djsm" podStartSLOduration=3.281833176 podStartE2EDuration="23.422824311s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.829964706 +0000 UTC m=+3.125310924" lastFinishedPulling="2026-04-24 22:30:35.970955838 +0000 UTC m=+23.266302059" observedRunningTime="2026-04-24 22:30:36.422601421 +0000 UTC m=+23.717947665" watchObservedRunningTime="2026-04-24 22:30:36.422824311 +0000 UTC m=+23.718170563"
Apr 24 22:30:37.259897 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:37.259689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:37.260076 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:37.259689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:37.260076 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:37.259995 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549"
Apr 24 22:30:37.260076 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:37.260064 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795"
Apr 24 22:30:37.440946 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:37.440912 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:37.441602 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:37.441586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:37.511376 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:37.511262 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:37.511848 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:37.511819 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gm79c"
Apr 24 22:30:38.260147 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:38.260106 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:38.260334 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:38.260242 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1"
Apr 24 22:30:39.259906 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.259879 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:39.259906 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.259879 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:39.260376 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:39.259984 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:39.260376 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:39.260048 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:39.401600 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.401425 2577 generic.go:358] "Generic (PLEG): container finished" podID="4264f6dc-3224-4773-b9ba-8ad8e185093f" containerID="929222d7ce3f8f37a61a79a5ace06cc3ec1f12f9534dbd6884ac20116f631338" exitCode=0 Apr 24 22:30:39.401792 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.401512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerDied","Data":"929222d7ce3f8f37a61a79a5ace06cc3ec1f12f9534dbd6884ac20116f631338"} Apr 24 22:30:39.404983 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.404965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log" Apr 24 22:30:39.405387 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.405365 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"7ff755a86b61b8b98240cb3436976bad8ab88e30de0f58aa7436700c1dee07ce"} Apr 24 22:30:39.405790 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:39.405777 2577 scope.go:117] "RemoveContainer" containerID="ff05d4983c5878f97ca4d5a59a7a0af4f0e5cd23cc526787d26f4cfc035ec035" Apr 24 22:30:40.260093 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.260065 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:40.260678 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:40.260167 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:40.365612 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.365578 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l8hdc"] Apr 24 22:30:40.365788 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.365706 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:40.365853 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:40.365826 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:40.369320 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.369291 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2jhbl"] Apr 24 22:30:40.369962 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.369936 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-g2bql"] Apr 24 22:30:40.370082 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.370066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:40.370181 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:40.370161 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:40.411121 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.411060 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log" Apr 24 22:30:40.411454 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.411396 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" event={"ID":"a41baed3-bda8-4fa8-a15e-3fdd5f955169","Type":"ContainerStarted","Data":"54e62263026b5a963f4f957283c07bafe84966548561442ce5a4231abf370952"} Apr 24 22:30:40.412078 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.412054 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:40.412202 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.412088 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:40.412202 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.412101 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:40.414134 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.414112 2577 generic.go:358] "Generic (PLEG): container finished" podID="4264f6dc-3224-4773-b9ba-8ad8e185093f" containerID="cc7194daaf838b892dc4589f3081bb0b4023c2fe70b0f704f523eae985cf1431" exitCode=0 Apr 24 22:30:40.414222 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.414178 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:40.414274 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.414211 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerDied","Data":"cc7194daaf838b892dc4589f3081bb0b4023c2fe70b0f704f523eae985cf1431"} Apr 24 22:30:40.414274 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:40.414265 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:40.427148 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.427126 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:40.427228 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.427202 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" Apr 24 22:30:40.453474 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:40.453424 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd" podStartSLOduration=9.998611651000001 podStartE2EDuration="27.45340919s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.828385334 +0000 UTC m=+3.123731552" lastFinishedPulling="2026-04-24 22:30:33.283182873 +0000 UTC m=+20.578529091" observedRunningTime="2026-04-24 22:30:40.452967482 +0000 UTC m=+27.748313722" watchObservedRunningTime="2026-04-24 22:30:40.45340919 +0000 UTC m=+27.748755442" Apr 24 
22:30:41.417602 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:41.417568 2577 generic.go:358] "Generic (PLEG): container finished" podID="4264f6dc-3224-4773-b9ba-8ad8e185093f" containerID="7a24ecd635e244a3b6be098d58bc202df8ed39bf87d9c24e518cef15a834052e" exitCode=0 Apr 24 22:30:41.418002 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:41.417626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerDied","Data":"7a24ecd635e244a3b6be098d58bc202df8ed39bf87d9c24e518cef15a834052e"} Apr 24 22:30:42.259596 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:42.259561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:42.259798 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:42.259561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:42.259798 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:42.259709 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:42.259921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:42.259795 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:42.259921 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:42.259561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:42.259921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:42.259886 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:44.259813 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:44.259603 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:44.260285 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:44.259633 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:44.260285 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:44.259900 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:44.260285 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:44.259672 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:44.260285 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:44.259993 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:44.260285 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:44.260093 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:46.260177 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.260102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql" Apr 24 22:30:46.260620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.260102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:30:46.260620 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.260240 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-g2bql" podUID="dc6ae891-bdaf-4a23-bf88-ea8e267d1795" Apr 24 22:30:46.260620 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.260301 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2jhbl" podUID="9b1dcaeb-db0d-424b-9f57-c60a54511aa1" Apr 24 22:30:46.260620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.260104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:46.260620 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.260379 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l8hdc" podUID="0466857c-9575-492e-9148-290f37031549" Apr 24 22:30:46.549207 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.549123 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-101.ec2.internal" event="NodeReady" Apr 24 22:30:46.549345 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.549279 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 22:30:46.639491 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.639456 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fl5t5"] Apr 24 22:30:46.681611 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.681578 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-48kr8"] Apr 24 22:30:46.681848 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.681811 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.685976 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.685954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svq4d\"" Apr 24 22:30:46.686216 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.686199 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:46.686502 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.686479 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:46.693449 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.693429 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fl5t5"] Apr 24 22:30:46.693532 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.693457 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-48kr8"] 
Apr 24 22:30:46.693585 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.693554 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-48kr8" Apr 24 22:30:46.700161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.700140 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:46.700254 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.700170 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:46.700310 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.700268 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdthp\"" Apr 24 22:30:46.703364 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.703345 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:46.792386 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.792340 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8" Apr 24 22:30:46.792386 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.792390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0fda03dc-9085-4272-aa29-583404383acf-tmp-dir\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.792634 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.792415 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.792634 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.792431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzvm\" (UniqueName: \"kubernetes.io/projected/0fda03dc-9085-4272-aa29-583404383acf-kube-api-access-lwzvm\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.792634 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.792469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fda03dc-9085-4272-aa29-583404383acf-config-volume\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.792634 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.792490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djmzx\" (UniqueName: \"kubernetes.io/projected/7cda4d53-f962-44ab-a661-325196e9edf2-kube-api-access-djmzx\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8" Apr 24 22:30:46.893643 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8" Apr 24 22:30:46.893643 ip-10-0-134-101 
kubenswrapper[2577]: I0424 22:30:46.893602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0fda03dc-9085-4272-aa29-583404383acf-tmp-dir\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.893643 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzvm\" (UniqueName: \"kubernetes.io/projected/0fda03dc-9085-4272-aa29-583404383acf-kube-api-access-lwzvm\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.893709 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0fda03dc-9085-4272-aa29-583404383acf-config-volume\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5" Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.893795 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.893814 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:47.393793687 +0000 UTC m=+34.689139930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.893817 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.893855 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:30:47.393835868 +0000 UTC m=+34.689182091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djmzx\" (UniqueName: \"kubernetes.io/projected/7cda4d53-f962-44ab-a661-325196e9edf2-kube-api-access-djmzx\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8" Apr 24 22:30:46.893921 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.893904 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:18.893890717 +0000 UTC m=+66.189236938 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:46.894244 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.893947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0fda03dc-9085-4272-aa29-583404383acf-tmp-dir\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:46.894295 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.894279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fda03dc-9085-4272-aa29-583404383acf-config-volume\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:46.915842 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.915820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzvm\" (UniqueName: \"kubernetes.io/projected/0fda03dc-9085-4272-aa29-583404383acf-kube-api-access-lwzvm\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:46.921259 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.921241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djmzx\" (UniqueName: \"kubernetes.io/projected/7cda4d53-f962-44ab-a661-325196e9edf2-kube-api-access-djmzx\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:30:46.994198 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.994159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:46.994339 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:46.994263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:46.994339 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.994330 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:46.994451 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.994354 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:46.994451 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.994354 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:46.994451 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.994367 2577 projected.go:194] Error preparing data for projected volume kube-api-access-hpzdl for pod openshift-network-diagnostics/network-check-target-2jhbl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:46.994451 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.994415 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret podName:dc6ae891-bdaf-4a23-bf88-ea8e267d1795 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:18.994402297 +0000 UTC m=+66.289748515 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret") pod "global-pull-secret-syncer-g2bql" (UID: "dc6ae891-bdaf-4a23-bf88-ea8e267d1795") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:46.994451 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:46.994434 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl podName:9b1dcaeb-db0d-424b-9f57-c60a54511aa1 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:18.994424642 +0000 UTC m=+66.289770864 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hpzdl" (UniqueName: "kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl") pod "network-check-target-2jhbl" (UID: "9b1dcaeb-db0d-424b-9f57-c60a54511aa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:47.397326 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:47.397289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:30:47.397326 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:47.397331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:47.397714 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:47.397425 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:47.397714 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:47.397440 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:47.397714 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:47.397477 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:30:48.397463864 +0000 UTC m=+35.692810082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:47.397714 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:47.397499 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:48.397484139 +0000 UTC m=+35.692830358 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:30:47.432698 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:47.432655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerStarted","Data":"c9940eb4183414d8ca737931ad2dbdf358a541ede2df98a4116fbbb410223dd3"}
Apr 24 22:30:48.259762 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.259724 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:30:48.259762 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.259740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:30:48.259953 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.259740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:30:48.268531 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.268513 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 22:30:48.268627 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.268520 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 22:30:48.269877 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.269845 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dgz7\""
Apr 24 22:30:48.270045 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.269913 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:30:48.270045 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.270006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dndrr\""
Apr 24 22:30:48.277969 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.277948 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:30:48.405036 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.404993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:30:48.405463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.405053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:48.405463 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:48.405151 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:48.405463 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:48.405216 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:50.405202293 +0000 UTC m=+37.700548511 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:30:48.405463 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:48.405150 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:48.405463 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:48.405312 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:30:50.405294504 +0000 UTC m=+37.700640725 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:48.437003 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.436975 2577 generic.go:358] "Generic (PLEG): container finished" podID="4264f6dc-3224-4773-b9ba-8ad8e185093f" containerID="c9940eb4183414d8ca737931ad2dbdf358a541ede2df98a4116fbbb410223dd3" exitCode=0
Apr 24 22:30:48.437126 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:48.437017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerDied","Data":"c9940eb4183414d8ca737931ad2dbdf358a541ede2df98a4116fbbb410223dd3"}
Apr 24 22:30:49.441452 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:49.441413 2577 generic.go:358] "Generic (PLEG): container finished" podID="4264f6dc-3224-4773-b9ba-8ad8e185093f" containerID="d10061a26fb788bf38489955d2e856abbcb10a2ed1b97efb5868a029b3d220dd" exitCode=0
Apr 24 22:30:49.441874 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:49.441459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerDied","Data":"d10061a26fb788bf38489955d2e856abbcb10a2ed1b97efb5868a029b3d220dd"}
Apr 24 22:30:50.420835 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:50.420670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:30:50.420984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:50.420854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:50.420984 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:50.420861 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:50.420984 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:50.420937 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:54.420915695 +0000 UTC m=+41.716261924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:30:50.420984 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:50.420941 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:50.420984 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:50.420979 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:30:54.420968543 +0000 UTC m=+41.716314765 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:50.446229 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:50.446188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" event={"ID":"4264f6dc-3224-4773-b9ba-8ad8e185093f","Type":"ContainerStarted","Data":"d1595122718a704079ba4b3ae83fcc7578664ce7650e3f07f03a06a5815e6a5a"}
Apr 24 22:30:50.483063 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:50.483009 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l8lg2" podStartSLOduration=6.179336971 podStartE2EDuration="37.482992026s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:15.821628393 +0000 UTC m=+3.116974612" lastFinishedPulling="2026-04-24 22:30:47.125283449 +0000 UTC m=+34.420629667" observedRunningTime="2026-04-24 22:30:50.482970408 +0000 UTC m=+37.778316647" watchObservedRunningTime="2026-04-24 22:30:50.482992026 +0000 UTC m=+37.778338258"
Apr 24 22:30:54.447673 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:54.447639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:30:54.447673 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:30:54.447684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:30:54.448222 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:54.447792 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:54.448222 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:54.447796 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:54.448222 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:54.447844 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:31:02.447830816 +0000 UTC m=+49.743177033 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:54.448222 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:30:54.447856 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:02.447850842 +0000 UTC m=+49.743197059 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:31:02.507796 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:02.507728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:31:02.507796 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:02.507802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:31:02.508508 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:02.507906 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:31:02.508508 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:02.507942 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:31:02.508508 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:02.507998 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:31:18.507981329 +0000 UTC m=+65.803327547 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:02.508508 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:02.508012 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:18.50800567 +0000 UTC m=+65.803351889 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:31:12.430640 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:12.430613 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbfmd"
Apr 24 22:31:18.522655 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:18.522613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:31:18.522655 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:18.522658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:31:18.523166 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:18.522770 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:31:18.523166 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:18.522772 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:31:18.523166 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:18.522830 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:50.522810739 +0000 UTC m=+97.818156960 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:31:18.523166 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:18.522870 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:31:50.522864012 +0000 UTC m=+97.818210231 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:18.925578 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:18.925543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc"
Apr 24 22:31:18.928423 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:18.928405 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 22:31:18.935889 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:18.935871 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:31:18.935979 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:18.935937 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs podName:0466857c-9575-492e-9148-290f37031549 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:22.935917138 +0000 UTC m=+130.231263358 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs") pod "network-metrics-daemon-l8hdc" (UID: "0466857c-9575-492e-9148-290f37031549") : secret "metrics-daemon-secret" not found
Apr 24 22:31:19.026416 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.026373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:31:19.026570 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.026472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:31:19.029445 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.029428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:31:19.029521 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.029501 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 22:31:19.039877 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.039850 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:31:19.040138 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.040121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dc6ae891-bdaf-4a23-bf88-ea8e267d1795-original-pull-secret\") pod \"global-pull-secret-syncer-g2bql\" (UID: \"dc6ae891-bdaf-4a23-bf88-ea8e267d1795\") " pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:31:19.049419 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.049392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzdl\" (UniqueName: \"kubernetes.io/projected/9b1dcaeb-db0d-424b-9f57-c60a54511aa1-kube-api-access-hpzdl\") pod \"network-check-target-2jhbl\" (UID: \"9b1dcaeb-db0d-424b-9f57-c60a54511aa1\") " pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:31:19.168042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.168013 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g2bql"
Apr 24 22:31:19.181769 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.181707 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dndrr\""
Apr 24 22:31:19.189021 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.188992 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:31:19.299511 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.299484 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-g2bql"]
Apr 24 22:31:19.303095 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:31:19.303056 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6ae891_bdaf_4a23_bf88_ea8e267d1795.slice/crio-f27742722a1e762e7f4edf0078a74afdfcde314d39a6520d2699c05983613814 WatchSource:0}: Error finding container f27742722a1e762e7f4edf0078a74afdfcde314d39a6520d2699c05983613814: Status 404 returned error can't find the container with id f27742722a1e762e7f4edf0078a74afdfcde314d39a6520d2699c05983613814
Apr 24 22:31:19.313895 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.313866 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2jhbl"]
Apr 24 22:31:19.317255 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:31:19.317232 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1dcaeb_db0d_424b_9f57_c60a54511aa1.slice/crio-3df39328e163fd174c8902065debe65c203439f5ad6d62415e8b5a8cf4a811c1 WatchSource:0}: Error finding container 3df39328e163fd174c8902065debe65c203439f5ad6d62415e8b5a8cf4a811c1: Status 404 returned error can't find the container with id 3df39328e163fd174c8902065debe65c203439f5ad6d62415e8b5a8cf4a811c1
Apr 24 22:31:19.501392 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.501326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2jhbl" event={"ID":"9b1dcaeb-db0d-424b-9f57-c60a54511aa1","Type":"ContainerStarted","Data":"3df39328e163fd174c8902065debe65c203439f5ad6d62415e8b5a8cf4a811c1"}
Apr 24 22:31:19.502208 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:19.502183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-g2bql" event={"ID":"dc6ae891-bdaf-4a23-bf88-ea8e267d1795","Type":"ContainerStarted","Data":"f27742722a1e762e7f4edf0078a74afdfcde314d39a6520d2699c05983613814"}
Apr 24 22:31:24.514723 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:24.514666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2jhbl" event={"ID":"9b1dcaeb-db0d-424b-9f57-c60a54511aa1","Type":"ContainerStarted","Data":"14796f849afcc2bd989b21fec1d9f9ab16cb512759c64b7b9589d856e2337595"}
Apr 24 22:31:24.515210 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:24.514790 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2jhbl"
Apr 24 22:31:24.516036 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:24.516013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-g2bql" event={"ID":"dc6ae891-bdaf-4a23-bf88-ea8e267d1795","Type":"ContainerStarted","Data":"cfbd8c3ae6eee5b8d986675fb9902711c600f6d108ef9d1805164625a39fac52"}
Apr 24 22:31:24.531960 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:24.531891 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2jhbl" podStartSLOduration=67.195090604 podStartE2EDuration="1m11.531879688s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:31:19.318932855 +0000 UTC m=+66.614279086" lastFinishedPulling="2026-04-24 22:31:23.655721952 +0000 UTC m=+70.951068170" observedRunningTime="2026-04-24 22:31:24.531523816 +0000 UTC m=+71.826870056" watchObservedRunningTime="2026-04-24 22:31:24.531879688 +0000 UTC m=+71.827225927"
Apr 24 22:31:24.546873 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:24.546831 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-g2bql" podStartSLOduration=66.192527955 podStartE2EDuration="1m10.546819016s" podCreationTimestamp="2026-04-24 22:30:14 +0000 UTC" firstStartedPulling="2026-04-24 22:31:19.304684127 +0000 UTC m=+66.600030346" lastFinishedPulling="2026-04-24 22:31:23.65897519 +0000 UTC m=+70.954321407" observedRunningTime="2026-04-24 22:31:24.546624495 +0000 UTC m=+71.841970736" watchObservedRunningTime="2026-04-24 22:31:24.546819016 +0000 UTC m=+71.842165257"
Apr 24 22:31:50.531918 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.531874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:31:50.531918 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.531911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:31:50.532338 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:50.532007 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:31:50.532338 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:50.532009 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:31:50.532338 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:50.532059 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls podName:0fda03dc-9085-4272-aa29-583404383acf nodeName:}" failed. No retries permitted until 2026-04-24 22:32:54.532043934 +0000 UTC m=+161.827390152 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls") pod "dns-default-fl5t5" (UID: "0fda03dc-9085-4272-aa29-583404383acf") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:50.532338 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:50.532072 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert podName:7cda4d53-f962-44ab-a661-325196e9edf2 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:54.532066489 +0000 UTC m=+161.827412707 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert") pod "ingress-canary-48kr8" (UID: "7cda4d53-f962-44ab-a661-325196e9edf2") : secret "canary-serving-cert" not found
Apr 24 22:31:50.743070 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.743028 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"]
Apr 24 22:31:50.746929 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.746909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"
Apr 24 22:31:50.750093 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.750075 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 22:31:50.750204 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.750122 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:31:50.750278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.750207 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 22:31:50.751293 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.751279 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 22:31:50.751379 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.751314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jdtsw\""
Apr 24 22:31:50.755947 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.755928 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"]
Apr 24 22:31:50.834543 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.834463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c42aa5d-90ad-4392-9988-c472ad007628-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"
Apr 24 22:31:50.834543 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.834521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c42aa5d-90ad-4392-9988-c472ad007628-config\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"
Apr 24 22:31:50.834703 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.834568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7g2d\" (UniqueName: \"kubernetes.io/projected/5c42aa5d-90ad-4392-9988-c472ad007628-kube-api-access-d7g2d\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"
Apr 24 22:31:50.846796 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.846768 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lj9nb"]
Apr 24 22:31:50.849483 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.849467 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.852263 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.852241 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 22:31:50.852396 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.852243 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 22:31:50.852531 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.852515 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 22:31:50.852587 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.852517 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-8r2bl\"" Apr 24 22:31:50.852587 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.852516 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 22:31:50.860343 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.860325 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lj9nb"] Apr 24 22:31:50.862201 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.862182 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 22:31:50.934859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.934831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2102ed1-69af-409e-b44a-dad1845de530-service-ca-bundle\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " 
pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.935024 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.934884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c42aa5d-90ad-4392-9988-c472ad007628-config\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:50.935024 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.934904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7g2d\" (UniqueName: \"kubernetes.io/projected/5c42aa5d-90ad-4392-9988-c472ad007628-kube-api-access-d7g2d\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:50.935024 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.934921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c2102ed1-69af-409e-b44a-dad1845de530-snapshots\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.935133 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.935029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2102ed1-69af-409e-b44a-dad1845de530-tmp\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.935133 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.935076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2102ed1-69af-409e-b44a-dad1845de530-serving-cert\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.935133 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.935115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c42aa5d-90ad-4392-9988-c472ad007628-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:50.935226 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.935140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2102ed1-69af-409e-b44a-dad1845de530-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.935226 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.935165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q9s8\" (UniqueName: \"kubernetes.io/projected/c2102ed1-69af-409e-b44a-dad1845de530-kube-api-access-9q9s8\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:50.935390 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.935372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c42aa5d-90ad-4392-9988-c472ad007628-config\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:50.937390 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.937363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c42aa5d-90ad-4392-9988-c472ad007628-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:50.943107 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:50.943081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7g2d\" (UniqueName: \"kubernetes.io/projected/5c42aa5d-90ad-4392-9988-c472ad007628-kube-api-access-d7g2d\") pod \"service-ca-operator-d6fc45fc5-fxkzv\" (UID: \"5c42aa5d-90ad-4392-9988-c472ad007628\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:51.036357 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2102ed1-69af-409e-b44a-dad1845de530-service-ca-bundle\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.036357 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c2102ed1-69af-409e-b44a-dad1845de530-snapshots\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.036566 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/c2102ed1-69af-409e-b44a-dad1845de530-tmp\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.036566 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2102ed1-69af-409e-b44a-dad1845de530-serving-cert\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.036566 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2102ed1-69af-409e-b44a-dad1845de530-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.036566 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q9s8\" (UniqueName: \"kubernetes.io/projected/c2102ed1-69af-409e-b44a-dad1845de530-kube-api-access-9q9s8\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.037031 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.036992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2102ed1-69af-409e-b44a-dad1845de530-tmp\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.037031 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:31:51.037027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2102ed1-69af-409e-b44a-dad1845de530-service-ca-bundle\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.037239 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.037045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c2102ed1-69af-409e-b44a-dad1845de530-snapshots\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.037350 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.037330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2102ed1-69af-409e-b44a-dad1845de530-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.038962 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.038939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2102ed1-69af-409e-b44a-dad1845de530-serving-cert\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.045520 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.045498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q9s8\" (UniqueName: \"kubernetes.io/projected/c2102ed1-69af-409e-b44a-dad1845de530-kube-api-access-9q9s8\") pod \"insights-operator-585dfdc468-lj9nb\" (UID: \"c2102ed1-69af-409e-b44a-dad1845de530\") " 
pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.056312 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.056297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" Apr 24 22:31:51.162278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.162249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" Apr 24 22:31:51.167912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.167890 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv"] Apr 24 22:31:51.171359 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:31:51.171330 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c42aa5d_90ad_4392_9988_c472ad007628.slice/crio-a888755edd4382ef8b2d7f93f9738035d4a8f13b95e364ee44056cb43c7a4cea WatchSource:0}: Error finding container a888755edd4382ef8b2d7f93f9738035d4a8f13b95e364ee44056cb43c7a4cea: Status 404 returned error can't find the container with id a888755edd4382ef8b2d7f93f9738035d4a8f13b95e364ee44056cb43c7a4cea Apr 24 22:31:51.271469 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.271436 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lj9nb"] Apr 24 22:31:51.275609 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:31:51.275585 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2102ed1_69af_409e_b44a_dad1845de530.slice/crio-7ab7b0b6bcc6947c472ce21b27cd2121e487fa6496789fc24641bde2f6944c7b WatchSource:0}: Error finding container 7ab7b0b6bcc6947c472ce21b27cd2121e487fa6496789fc24641bde2f6944c7b: Status 404 returned error can't find the container with id 
7ab7b0b6bcc6947c472ce21b27cd2121e487fa6496789fc24641bde2f6944c7b Apr 24 22:31:51.565854 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.565812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" event={"ID":"5c42aa5d-90ad-4392-9988-c472ad007628","Type":"ContainerStarted","Data":"a888755edd4382ef8b2d7f93f9738035d4a8f13b95e364ee44056cb43c7a4cea"} Apr 24 22:31:51.566787 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:51.566740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" event={"ID":"c2102ed1-69af-409e-b44a-dad1845de530","Type":"ContainerStarted","Data":"7ab7b0b6bcc6947c472ce21b27cd2121e487fa6496789fc24641bde2f6944c7b"} Apr 24 22:31:54.574414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:54.574376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" event={"ID":"c2102ed1-69af-409e-b44a-dad1845de530","Type":"ContainerStarted","Data":"ee39b17690ea8da45eca63f9f691db820f1665d71062a65b4f04e326d2995df1"} Apr 24 22:31:54.575691 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:54.575669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" event={"ID":"5c42aa5d-90ad-4392-9988-c472ad007628","Type":"ContainerStarted","Data":"28984f9e74993e1ca9c5ad6b56ac8ca3e821f0291290b74dcdca62a4ca0dab1d"} Apr 24 22:31:54.592402 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:54.592352 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" podStartSLOduration=2.248867745 podStartE2EDuration="4.592340971s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:31:51.277215595 +0000 UTC m=+98.572561812" lastFinishedPulling="2026-04-24 22:31:53.620688817 +0000 UTC m=+100.916035038" 
observedRunningTime="2026-04-24 22:31:54.591592833 +0000 UTC m=+101.886939074" watchObservedRunningTime="2026-04-24 22:31:54.592340971 +0000 UTC m=+101.887687211" Apr 24 22:31:54.605934 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:54.605895 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" podStartSLOduration=2.1657481929999998 podStartE2EDuration="4.60588271s" podCreationTimestamp="2026-04-24 22:31:50 +0000 UTC" firstStartedPulling="2026-04-24 22:31:51.173083834 +0000 UTC m=+98.468430052" lastFinishedPulling="2026-04-24 22:31:53.613218343 +0000 UTC m=+100.908564569" observedRunningTime="2026-04-24 22:31:54.605487074 +0000 UTC m=+101.900833331" watchObservedRunningTime="2026-04-24 22:31:54.60588271 +0000 UTC m=+101.901228950" Apr 24 22:31:55.520526 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:55.520496 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2jhbl" Apr 24 22:31:56.432948 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.432915 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k"] Apr 24 22:31:56.459323 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.459297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k"] Apr 24 22:31:56.459461 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.459413 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" Apr 24 22:31:56.483181 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.483159 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 22:31:56.483298 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.483163 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 22:31:56.484520 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.484493 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-rtmk9\"" Apr 24 22:31:56.575937 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.575904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzm6r\" (UniqueName: \"kubernetes.io/projected/32d08a62-7ae6-4083-b119-caaead1033c8-kube-api-access-tzm6r\") pod \"migrator-74bb7799d9-gbl9k\" (UID: \"32d08a62-7ae6-4083-b119-caaead1033c8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" Apr 24 22:31:56.676947 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.676909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzm6r\" (UniqueName: \"kubernetes.io/projected/32d08a62-7ae6-4083-b119-caaead1033c8-kube-api-access-tzm6r\") pod \"migrator-74bb7799d9-gbl9k\" (UID: \"32d08a62-7ae6-4083-b119-caaead1033c8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" Apr 24 22:31:56.686326 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.686274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzm6r\" (UniqueName: 
\"kubernetes.io/projected/32d08a62-7ae6-4083-b119-caaead1033c8-kube-api-access-tzm6r\") pod \"migrator-74bb7799d9-gbl9k\" (UID: \"32d08a62-7ae6-4083-b119-caaead1033c8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" Apr 24 22:31:56.768315 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.768283 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" Apr 24 22:31:56.875600 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.875577 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4v6p7_92fe7641-e2dc-499a-a25d-09cdcdac368b/dns-node-resolver/0.log" Apr 24 22:31:56.897879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:56.897853 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k"] Apr 24 22:31:57.582169 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:57.582132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" event={"ID":"32d08a62-7ae6-4083-b119-caaead1033c8","Type":"ContainerStarted","Data":"36734b6d63d5beaa28a0bbd993187cbd47cbd2a0aafd87bf059ff17c5f9900b4"} Apr 24 22:31:57.870331 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:57.870254 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rp5x6_9ea69dcf-def3-4c54-b774-dad54e40bced/node-ca/0.log" Apr 24 22:31:58.586177 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.586141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" event={"ID":"32d08a62-7ae6-4083-b119-caaead1033c8","Type":"ContainerStarted","Data":"f192a686314cea71873664effb1b0696fbd77e4fe2b83f132a25fad578ebc38e"} Apr 24 22:31:58.586177 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.586179 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" event={"ID":"32d08a62-7ae6-4083-b119-caaead1033c8","Type":"ContainerStarted","Data":"f18531929bf7ecd0af614a2fb784ee375c43ef3de9323959e17e9da4ea91f2d5"} Apr 24 22:31:58.603047 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.602995 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gbl9k" podStartSLOduration=1.542113571 podStartE2EDuration="2.602980934s" podCreationTimestamp="2026-04-24 22:31:56 +0000 UTC" firstStartedPulling="2026-04-24 22:31:56.902153032 +0000 UTC m=+104.197499250" lastFinishedPulling="2026-04-24 22:31:57.963020395 +0000 UTC m=+105.258366613" observedRunningTime="2026-04-24 22:31:58.601764946 +0000 UTC m=+105.897111177" watchObservedRunningTime="2026-04-24 22:31:58.602980934 +0000 UTC m=+105.898327174" Apr 24 22:31:58.683359 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.683323 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jq4rv"] Apr 24 22:31:58.686387 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.686371 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.688802 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.688780 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 22:31:58.689063 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.689047 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 22:31:58.689143 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.689095 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 22:31:58.689205 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.689190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qfsfz\"" Apr 24 22:31:58.689362 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.689345 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 22:31:58.695135 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.695112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jq4rv"] Apr 24 22:31:58.790541 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.790506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3d1027b6-dd68-4226-8009-74a6f7c62f29-signing-cabundle\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.790713 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.790563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmcz\" (UniqueName: 
\"kubernetes.io/projected/3d1027b6-dd68-4226-8009-74a6f7c62f29-kube-api-access-gbmcz\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.790713 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.790648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3d1027b6-dd68-4226-8009-74a6f7c62f29-signing-key\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.891866 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.891790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3d1027b6-dd68-4226-8009-74a6f7c62f29-signing-cabundle\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.891866 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.891837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbmcz\" (UniqueName: \"kubernetes.io/projected/3d1027b6-dd68-4226-8009-74a6f7c62f29-kube-api-access-gbmcz\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.892051 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.891889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3d1027b6-dd68-4226-8009-74a6f7c62f29-signing-key\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv" Apr 24 22:31:58.892458 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.892438 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3d1027b6-dd68-4226-8009-74a6f7c62f29-signing-cabundle\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv"
Apr 24 22:31:58.894385 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.894367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3d1027b6-dd68-4226-8009-74a6f7c62f29-signing-key\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv"
Apr 24 22:31:58.900891 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.900872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmcz\" (UniqueName: \"kubernetes.io/projected/3d1027b6-dd68-4226-8009-74a6f7c62f29-kube-api-access-gbmcz\") pod \"service-ca-865cb79987-jq4rv\" (UID: \"3d1027b6-dd68-4226-8009-74a6f7c62f29\") " pod="openshift-service-ca/service-ca-865cb79987-jq4rv"
Apr 24 22:31:58.994931 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:58.994895 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jq4rv"
Apr 24 22:31:59.109099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.109061 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jq4rv"]
Apr 24 22:31:59.113215 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:31:59.113189 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1027b6_dd68_4226_8009_74a6f7c62f29.slice/crio-8c9bf9bd0b4b50745ec99c3f8f2196961e6739a72cf34c5842e59d0384f1dd05 WatchSource:0}: Error finding container 8c9bf9bd0b4b50745ec99c3f8f2196961e6739a72cf34c5842e59d0384f1dd05: Status 404 returned error can't find the container with id 8c9bf9bd0b4b50745ec99c3f8f2196961e6739a72cf34c5842e59d0384f1dd05
Apr 24 22:31:59.182954 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.182932 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"]
Apr 24 22:31:59.185839 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.185827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.188482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.188460 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 22:31:59.188902 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.188868 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzwft\""
Apr 24 22:31:59.189051 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.188972 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 22:31:59.189215 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.189191 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 22:31:59.195718 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.195692 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 22:31:59.200190 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.200169 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"]
Apr 24 22:31:59.294245 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-installation-pull-secrets\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294393 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294283 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-certificates\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294393 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33723fd1-590b-4af7-ba53-6d39a38013fd-ca-trust-extracted\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294393 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-trusted-ca\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294506 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294506 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvhb8\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-kube-api-access-wvhb8\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294562 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-image-registry-private-configuration\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.294562 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.294532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-bound-sa-token\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395677 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvhb8\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-kube-api-access-wvhb8\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-image-registry-private-configuration\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-bound-sa-token\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-installation-pull-secrets\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-certificates\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33723fd1-590b-4af7-ba53-6d39a38013fd-ca-trust-extracted\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.395894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-trusted-ca\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.396224 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.395935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.396224 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:59.396043 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:31:59.396224 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:59.396057 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bfbcbcd4d-g6dfw: secret "image-registry-tls" not found
Apr 24 22:31:59.396224 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:59.396118 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls podName:33723fd1-590b-4af7-ba53-6d39a38013fd nodeName:}" failed. No retries permitted until 2026-04-24 22:31:59.89609588 +0000 UTC m=+107.191442100 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls") pod "image-registry-bfbcbcd4d-g6dfw" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd") : secret "image-registry-tls" not found
Apr 24 22:31:59.396442 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.396338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33723fd1-590b-4af7-ba53-6d39a38013fd-ca-trust-extracted\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.396493 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.396480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-certificates\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.396824 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.396801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-trusted-ca\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.398361 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.398337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-image-registry-private-configuration\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.398470 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.398404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-installation-pull-secrets\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.405971 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.405951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-bound-sa-token\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.406201 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.406183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvhb8\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-kube-api-access-wvhb8\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.591359 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.591323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jq4rv" event={"ID":"3d1027b6-dd68-4226-8009-74a6f7c62f29","Type":"ContainerStarted","Data":"0b5bf9eccbb0a8ce8d8c4c14dc9622dfb3fd4cab8b4a04b180126dc72510f117"}
Apr 24 22:31:59.591359 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.591363 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jq4rv" event={"ID":"3d1027b6-dd68-4226-8009-74a6f7c62f29","Type":"ContainerStarted","Data":"8c9bf9bd0b4b50745ec99c3f8f2196961e6739a72cf34c5842e59d0384f1dd05"}
Apr 24 22:31:59.610384 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.610339 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jq4rv" podStartSLOduration=1.610324957 podStartE2EDuration="1.610324957s" podCreationTimestamp="2026-04-24 22:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:59.609076194 +0000 UTC m=+106.904422434" watchObservedRunningTime="2026-04-24 22:31:59.610324957 +0000 UTC m=+106.905671198"
Apr 24 22:31:59.900641 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:31:59.900593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:31:59.900844 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:59.900741 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:31:59.900844 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:59.900783 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bfbcbcd4d-g6dfw: secret "image-registry-tls" not found
Apr 24 22:31:59.900844 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:31:59.900837 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls podName:33723fd1-590b-4af7-ba53-6d39a38013fd nodeName:}" failed. No retries permitted until 2026-04-24 22:32:00.900824004 +0000 UTC m=+108.196170221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls") pod "image-registry-bfbcbcd4d-g6dfw" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd") : secret "image-registry-tls" not found
Apr 24 22:32:00.908780 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:00.908716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:00.909240 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:00.908915 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:32:00.909240 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:00.908932 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bfbcbcd4d-g6dfw: secret "image-registry-tls" not found
Apr 24 22:32:00.909240 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:00.908991 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls podName:33723fd1-590b-4af7-ba53-6d39a38013fd nodeName:}" failed. No retries permitted until 2026-04-24 22:32:02.908971073 +0000 UTC m=+110.204317294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls") pod "image-registry-bfbcbcd4d-g6dfw" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd") : secret "image-registry-tls" not found
Apr 24 22:32:02.926235 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:02.926187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:02.926655 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:02.926346 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:32:02.926655 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:02.926366 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bfbcbcd4d-g6dfw: secret "image-registry-tls" not found
Apr 24 22:32:02.926655 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:02.926427 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls podName:33723fd1-590b-4af7-ba53-6d39a38013fd nodeName:}" failed. No retries permitted until 2026-04-24 22:32:06.926406054 +0000 UTC m=+114.221752285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls") pod "image-registry-bfbcbcd4d-g6dfw" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd") : secret "image-registry-tls" not found
Apr 24 22:32:06.958098 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:06.958055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:06.960638 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:06.960610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"image-registry-bfbcbcd4d-g6dfw\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") " pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:06.994480 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:06.994447 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:07.115441 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:07.115410 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"]
Apr 24 22:32:07.120382 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:07.120353 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33723fd1_590b_4af7_ba53_6d39a38013fd.slice/crio-e497d7806ac4136ff7dd8254a97d75082fcbbb76c42eced726aeae66d0c6473e WatchSource:0}: Error finding container e497d7806ac4136ff7dd8254a97d75082fcbbb76c42eced726aeae66d0c6473e: Status 404 returned error can't find the container with id e497d7806ac4136ff7dd8254a97d75082fcbbb76c42eced726aeae66d0c6473e
Apr 24 22:32:07.611690 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:07.611655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" event={"ID":"33723fd1-590b-4af7-ba53-6d39a38013fd","Type":"ContainerStarted","Data":"78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407"}
Apr 24 22:32:07.611690 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:07.611692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" event={"ID":"33723fd1-590b-4af7-ba53-6d39a38013fd","Type":"ContainerStarted","Data":"e497d7806ac4136ff7dd8254a97d75082fcbbb76c42eced726aeae66d0c6473e"}
Apr 24 22:32:07.611899 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:07.611789 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:07.633135 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:07.633090 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" podStartSLOduration=8.633075492 podStartE2EDuration="8.633075492s" podCreationTimestamp="2026-04-24 22:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:07.631778211 +0000 UTC m=+114.927124447" watchObservedRunningTime="2026-04-24 22:32:07.633075492 +0000 UTC m=+114.928421731"
Apr 24 22:32:22.128742 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.128700 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"]
Apr 24 22:32:22.133631 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.133609 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.137336 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.137315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dz4pr\""
Apr 24 22:32:22.137336 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.137334 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 22:32:22.137470 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.137315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 22:32:22.142517 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.142472 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"]
Apr 24 22:32:22.143730 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.143698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"]
Apr 24 22:32:22.146730 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.146709 2577 patch_prober.go:28] interesting pod/image-registry-bfbcbcd4d-g6dfw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 22:32:22.146844 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.146801 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:32:22.254264 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.254236 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mdrlq"]
Apr 24 22:32:22.257424 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.257409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.260642 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.260623 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dwrxh\""
Apr 24 22:32:22.260880 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.260866 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 22:32:22.261001 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.260984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 22:32:22.277091 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.277063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5dc0ab6c-0ed6-435b-9203-3352b2d3e68e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wkmx9\" (UID: \"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.277187 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.277100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dc0ab6c-0ed6-435b-9203-3352b2d3e68e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wkmx9\" (UID: \"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.281214 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.281195 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mdrlq"]
Apr 24 22:32:22.378100 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c309811-846a-4c71-843a-c45c3f281cce-crio-socket\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.378219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c309811-846a-4c71-843a-c45c3f281cce-data-volume\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.378219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c309811-846a-4c71-843a-c45c3f281cce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.378219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5rq\" (UniqueName: \"kubernetes.io/projected/0c309811-846a-4c71-843a-c45c3f281cce-kube-api-access-hx5rq\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.378219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5dc0ab6c-0ed6-435b-9203-3352b2d3e68e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wkmx9\" (UID: \"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.378219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c309811-846a-4c71-843a-c45c3f281cce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.378219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dc0ab6c-0ed6-435b-9203-3352b2d3e68e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wkmx9\" (UID: \"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.378728 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.378707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5dc0ab6c-0ed6-435b-9203-3352b2d3e68e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wkmx9\" (UID: \"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.380709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.380663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dc0ab6c-0ed6-435b-9203-3352b2d3e68e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wkmx9\" (UID: \"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.442709 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.442683 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"
Apr 24 22:32:22.479340 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c309811-846a-4c71-843a-c45c3f281cce-crio-socket\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.479473 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c309811-846a-4c71-843a-c45c3f281cce-data-volume\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.479473 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c309811-846a-4c71-843a-c45c3f281cce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.479473 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5rq\" (UniqueName: \"kubernetes.io/projected/0c309811-846a-4c71-843a-c45c3f281cce-kube-api-access-hx5rq\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.479473 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0c309811-846a-4c71-843a-c45c3f281cce-crio-socket\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.479473 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c309811-846a-4c71-843a-c45c3f281cce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.479777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0c309811-846a-4c71-843a-c45c3f281cce-data-volume\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.480016 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.479997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0c309811-846a-4c71-843a-c45c3f281cce-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.481834 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.481817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0c309811-846a-4c71-843a-c45c3f281cce-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.491064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.491025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5rq\" (UniqueName: \"kubernetes.io/projected/0c309811-846a-4c71-843a-c45c3f281cce-kube-api-access-hx5rq\") pod \"insights-runtime-extractor-mdrlq\" (UID: \"0c309811-846a-4c71-843a-c45c3f281cce\") " pod="openshift-insights/insights-runtime-extractor-mdrlq"
Apr 24 22:32:22.564392 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.564362 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9"]
Apr 24 22:32:22.566289 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.566270 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mdrlq" Apr 24 22:32:22.567817 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:22.567787 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dc0ab6c_0ed6_435b_9203_3352b2d3e68e.slice/crio-cb0337ef464884388a1b67e51e918b91af31398ccdb83675e708d1a442df3385 WatchSource:0}: Error finding container cb0337ef464884388a1b67e51e918b91af31398ccdb83675e708d1a442df3385: Status 404 returned error can't find the container with id cb0337ef464884388a1b67e51e918b91af31398ccdb83675e708d1a442df3385 Apr 24 22:32:22.647214 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.647180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9" event={"ID":"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e","Type":"ContainerStarted","Data":"cb0337ef464884388a1b67e51e918b91af31398ccdb83675e708d1a442df3385"} Apr 24 22:32:22.685872 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.685846 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mdrlq"] Apr 24 22:32:22.689106 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:22.689081 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c309811_846a_4c71_843a_c45c3f281cce.slice/crio-0365e52f80ae9281281292edb5998a5be4959f7624933bbcda25dafc66c3f398 WatchSource:0}: Error finding container 0365e52f80ae9281281292edb5998a5be4959f7624933bbcda25dafc66c3f398: Status 404 returned error can't find the container with id 0365e52f80ae9281281292edb5998a5be4959f7624933bbcda25dafc66c3f398 Apr 24 22:32:22.985037 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.984996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:32:22.987642 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:22.987612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0466857c-9575-492e-9148-290f37031549-metrics-certs\") pod \"network-metrics-daemon-l8hdc\" (UID: \"0466857c-9575-492e-9148-290f37031549\") " pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:32:23.077031 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.076994 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dgz7\"" Apr 24 22:32:23.085268 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.085241 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l8hdc" Apr 24 22:32:23.208018 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.207988 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l8hdc"] Apr 24 22:32:23.407761 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:23.407672 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0466857c_9575_492e_9148_290f37031549.slice/crio-0bfc5cb6e32d3eec157b6db35a48e0cecdb8a2b030e3f82682a18404f5d5a59d WatchSource:0}: Error finding container 0bfc5cb6e32d3eec157b6db35a48e0cecdb8a2b030e3f82682a18404f5d5a59d: Status 404 returned error can't find the container with id 0bfc5cb6e32d3eec157b6db35a48e0cecdb8a2b030e3f82682a18404f5d5a59d Apr 24 22:32:23.650936 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.650893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9" event={"ID":"5dc0ab6c-0ed6-435b-9203-3352b2d3e68e","Type":"ContainerStarted","Data":"921f68d17638ac19e4cee971a1b5cd0ffcb73846c193db56f08fd299d8dd58d4"} Apr 24 22:32:23.652399 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.652375 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mdrlq" event={"ID":"0c309811-846a-4c71-843a-c45c3f281cce","Type":"ContainerStarted","Data":"b33ef64e913c8a136d1e73fdaee6273a42973d207509aab2834c4532f966ffce"} Apr 24 22:32:23.652521 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.652403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mdrlq" event={"ID":"0c309811-846a-4c71-843a-c45c3f281cce","Type":"ContainerStarted","Data":"9064a7f975aa72b87fa853cd5d9d9d17dce8d2651fd434ca40252ed43a47a4f3"} Apr 24 22:32:23.652521 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.652417 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mdrlq" event={"ID":"0c309811-846a-4c71-843a-c45c3f281cce","Type":"ContainerStarted","Data":"0365e52f80ae9281281292edb5998a5be4959f7624933bbcda25dafc66c3f398"} Apr 24 22:32:23.653287 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.653266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l8hdc" event={"ID":"0466857c-9575-492e-9148-290f37031549","Type":"ContainerStarted","Data":"0bfc5cb6e32d3eec157b6db35a48e0cecdb8a2b030e3f82682a18404f5d5a59d"} Apr 24 22:32:23.667682 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:23.667639 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wkmx9" podStartSLOduration=0.698070196 podStartE2EDuration="1.667627821s" podCreationTimestamp="2026-04-24 22:32:22 +0000 UTC" firstStartedPulling="2026-04-24 
22:32:22.57078295 +0000 UTC m=+129.866129168" lastFinishedPulling="2026-04-24 22:32:23.540340569 +0000 UTC m=+130.835686793" observedRunningTime="2026-04-24 22:32:23.666935477 +0000 UTC m=+130.962281717" watchObservedRunningTime="2026-04-24 22:32:23.667627821 +0000 UTC m=+130.962974117" Apr 24 22:32:25.663984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:25.663944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mdrlq" event={"ID":"0c309811-846a-4c71-843a-c45c3f281cce","Type":"ContainerStarted","Data":"192ea1581e1a6b2f3f787a0e0f358dd5be40d06db84dadcbcee685f6f3f571c7"} Apr 24 22:32:25.665414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:25.665378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l8hdc" event={"ID":"0466857c-9575-492e-9148-290f37031549","Type":"ContainerStarted","Data":"6de326a31f68b58f422a470f4c14e37c79dda14e6d249e6514c60d92b697a7a8"} Apr 24 22:32:25.665414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:25.665406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l8hdc" event={"ID":"0466857c-9575-492e-9148-290f37031549","Type":"ContainerStarted","Data":"76168473d87c91842aea8b53690c315b02e03ed38dc8d772faca828dd6e83e66"} Apr 24 22:32:25.683101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:25.683048 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mdrlq" podStartSLOduration=1.418464586 podStartE2EDuration="3.68303291s" podCreationTimestamp="2026-04-24 22:32:22 +0000 UTC" firstStartedPulling="2026-04-24 22:32:22.742278982 +0000 UTC m=+130.037625200" lastFinishedPulling="2026-04-24 22:32:25.006847299 +0000 UTC m=+132.302193524" observedRunningTime="2026-04-24 22:32:25.68296438 +0000 UTC m=+132.978310621" watchObservedRunningTime="2026-04-24 22:32:25.68303291 +0000 UTC m=+132.978379151" Apr 24 22:32:25.703808 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:25.703762 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l8hdc" podStartSLOduration=131.109379658 podStartE2EDuration="2m12.703731621s" podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:32:23.409558802 +0000 UTC m=+130.704905019" lastFinishedPulling="2026-04-24 22:32:25.003910765 +0000 UTC m=+132.299256982" observedRunningTime="2026-04-24 22:32:25.702090471 +0000 UTC m=+132.997436710" watchObservedRunningTime="2026-04-24 22:32:25.703731621 +0000 UTC m=+132.999077861" Apr 24 22:32:32.147019 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:32.146986 2577 patch_prober.go:28] interesting pod/image-registry-bfbcbcd4d-g6dfw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 22:32:32.147378 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:32.147035 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:32:34.336046 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.336004 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"] Apr 24 22:32:34.340637 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.340614 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.343298 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.343275 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 22:32:34.344611 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.344589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 22:32:34.344706 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.344631 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 22:32:34.344706 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.344647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5kmhk\"" Apr 24 22:32:34.344845 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.344589 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:32:34.344954 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.344930 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 22:32:34.348266 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.348248 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8jtkx"] Apr 24 22:32:34.353305 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.353287 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"] Apr 24 22:32:34.353414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.353389 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.356496 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.356474 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8rvqf\"" Apr 24 22:32:34.356595 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.356581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 22:32:34.356739 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.356474 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 22:32:34.356853 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.356834 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 22:32:34.475515 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-metrics-client-ca\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.475515 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475518 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f9e77c1-2948-4c92-9f23-a02949e0904b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.475777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475553 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-root\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.475777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-tls\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.475777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r42md\" (UniqueName: \"kubernetes.io/projected/9f9e77c1-2948-4c92-9f23-a02949e0904b-kube-api-access-r42md\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.475777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.475777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-textfile\") 
pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.475777 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45j88\" (UniqueName: \"kubernetes.io/projected/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-kube-api-access-45j88\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.476064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-wtmp\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.476064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-sys\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.476064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.476064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475929 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f9e77c1-2948-4c92-9f23-a02949e0904b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.476064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.475978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f9e77c1-2948-4c92-9f23-a02949e0904b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.577114 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-root\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577114 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-tls\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r42md\" (UniqueName: \"kubernetes.io/projected/9f9e77c1-2948-4c92-9f23-a02949e0904b-kube-api-access-r42md\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: 
\"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.577334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-root\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-textfile\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577334 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:34.577296 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 22:32:34.577558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45j88\" (UniqueName: \"kubernetes.io/projected/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-kube-api-access-45j88\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577558 
ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:34.577356 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-tls podName:a56a8cc4-5a2f-4884-80eb-0c8d1dee9455 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:35.077336494 +0000 UTC m=+142.372682719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-tls") pod "node-exporter-8jtkx" (UID: "a56a8cc4-5a2f-4884-80eb-0c8d1dee9455") : secret "node-exporter-tls" not found Apr 24 22:32:34.577558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-wtmp\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-sys\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577535 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f9e77c1-2948-4c92-9f23-a02949e0904b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.577859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-wtmp\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f9e77c1-2948-4c92-9f23-a02949e0904b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.577859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-sys\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-metrics-client-ca\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577859 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-textfile\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.577859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.577696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f9e77c1-2948-4c92-9f23-a02949e0904b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.578389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.578295 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f9e77c1-2948-4c92-9f23-a02949e0904b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" Apr 24 22:32:34.578389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.578419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-metrics-client-ca\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx" Apr 24 22:32:34.578389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.578419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx"
Apr 24 22:32:34.580352 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.580328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f9e77c1-2948-4c92-9f23-a02949e0904b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"
Apr 24 22:32:34.580628 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.580605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx"
Apr 24 22:32:34.580742 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.580718 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f9e77c1-2948-4c92-9f23-a02949e0904b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"
Apr 24 22:32:34.591444 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.591396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r42md\" (UniqueName: \"kubernetes.io/projected/9f9e77c1-2948-4c92-9f23-a02949e0904b-kube-api-access-r42md\") pod \"openshift-state-metrics-9d44df66c-fkm5w\" (UID: \"9f9e77c1-2948-4c92-9f23-a02949e0904b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"
Apr 24 22:32:34.593094 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.593076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45j88\" (UniqueName: \"kubernetes.io/projected/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-kube-api-access-45j88\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx"
Apr 24 22:32:34.651250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.651218 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"
Apr 24 22:32:34.776916 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:34.776850 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w"]
Apr 24 22:32:34.780190 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:34.780157 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f9e77c1_2948_4c92_9f23_a02949e0904b.slice/crio-2dc6d8c71dc7b1e113a61220f20ca61ad51cad484132ed5b6ec5317a39bf76f7 WatchSource:0}: Error finding container 2dc6d8c71dc7b1e113a61220f20ca61ad51cad484132ed5b6ec5317a39bf76f7: Status 404 returned error can't find the container with id 2dc6d8c71dc7b1e113a61220f20ca61ad51cad484132ed5b6ec5317a39bf76f7
Apr 24 22:32:35.083113 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.083075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-tls\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx"
Apr 24 22:32:35.085434 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.085415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a56a8cc4-5a2f-4884-80eb-0c8d1dee9455-node-exporter-tls\") pod \"node-exporter-8jtkx\" (UID: \"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455\") " pod="openshift-monitoring/node-exporter-8jtkx"
Apr 24 22:32:35.264268 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.264235 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8jtkx"
Apr 24 22:32:35.272666 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:35.272633 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda56a8cc4_5a2f_4884_80eb_0c8d1dee9455.slice/crio-dce77e15de7c21443e2b9839b3981f906ca8b8e89e042b410b32d6ad4d3bb737 WatchSource:0}: Error finding container dce77e15de7c21443e2b9839b3981f906ca8b8e89e042b410b32d6ad4d3bb737: Status 404 returned error can't find the container with id dce77e15de7c21443e2b9839b3981f906ca8b8e89e042b410b32d6ad4d3bb737
Apr 24 22:32:35.405051 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.405013 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:35.409040 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.409016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.412597 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412388 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 22:32:35.412597 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412398 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 22:32:35.412597 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 22:32:35.412891 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 22:32:35.412891 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412819 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 22:32:35.412891 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412869 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 22:32:35.413042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.412821 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rqn8z\""
Apr 24 22:32:35.413042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.413028 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 22:32:35.413143 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.413079 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 22:32:35.426482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.420879 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 22:32:35.426482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.425418 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:35.587112 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587112 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587321 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587321 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-out\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587321 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-volume\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-web-config\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkzbq\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-kube-api-access-mkzbq\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587734 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587734 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.587734 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.587594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689064 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689246 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689246 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689246 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689246 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-out\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689246 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689246 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-volume\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-web-config\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzbq\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-kube-api-access-mkzbq\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.689576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.689405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.691363 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.691015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.691363 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.691024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.692524 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.692495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.692621 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.692541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.693047 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.692851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.693152 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.693045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.693500 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.693473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-out\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.694097 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.694053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.694277 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.694252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-volume\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.695091 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.695069 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-web-config\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.695326 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.695250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.695326 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.695254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.696770 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.696697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jtkx" event={"ID":"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455","Type":"ContainerStarted","Data":"dce77e15de7c21443e2b9839b3981f906ca8b8e89e042b410b32d6ad4d3bb737"}
Apr 24 22:32:35.698564 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.698536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" event={"ID":"9f9e77c1-2948-4c92-9f23-a02949e0904b","Type":"ContainerStarted","Data":"10e6eec88be9d60565754bab6a9bf9a1d9103e9eec9e682b6d4b98d53af8eaed"}
Apr 24 22:32:35.698665 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.698573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" event={"ID":"9f9e77c1-2948-4c92-9f23-a02949e0904b","Type":"ContainerStarted","Data":"63d3b0839ea34686ab708966bd88c5c715fb018fbb58bff226a47013556b2740"}
Apr 24 22:32:35.698665 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.698586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" event={"ID":"9f9e77c1-2948-4c92-9f23-a02949e0904b","Type":"ContainerStarted","Data":"2dc6d8c71dc7b1e113a61220f20ca61ad51cad484132ed5b6ec5317a39bf76f7"}
Apr 24 22:32:35.698875 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.698855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzbq\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-kube-api-access-mkzbq\") pod \"alertmanager-main-0\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.723076 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.723050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:35.963352 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:35.963291 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:35.969482 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:35.969448 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd385150b_b9ae_4267_852b_d4bf79c4b9ab.slice/crio-c6924d4b535a1d6fcd51e8ec30756cd241a56a01a3f282bfb5e6847971913cab WatchSource:0}: Error finding container c6924d4b535a1d6fcd51e8ec30756cd241a56a01a3f282bfb5e6847971913cab: Status 404 returned error can't find the container with id c6924d4b535a1d6fcd51e8ec30756cd241a56a01a3f282bfb5e6847971913cab
Apr 24 22:32:36.228824 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.228787 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4rldg"]
Apr 24 22:32:36.232127 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.232111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:36.234718 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.234696 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 22:32:36.234833 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.234716 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-dsbgq\""
Apr 24 22:32:36.234833 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.234716 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 22:32:36.240963 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.240916 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4rldg"]
Apr 24 22:32:36.396270 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.396238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gl4\" (UniqueName: \"kubernetes.io/projected/d71288e3-0a52-4c68-9d3b-643365a30e02-kube-api-access-b8gl4\") pod \"downloads-6bcc868b7-4rldg\" (UID: \"d71288e3-0a52-4c68-9d3b-643365a30e02\") " pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:36.397474 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.397443 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"]
Apr 24 22:32:36.401515 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.401497 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.407451 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.407429 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-cd4ibmpkprr36\""
Apr 24 22:32:36.408015 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.407994 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 22:32:36.408184 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.408118 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-trdbf\""
Apr 24 22:32:36.408249 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.408205 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 22:32:36.408422 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.408325 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 22:32:36.408422 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.408368 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 22:32:36.409167 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.408731 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 22:32:36.436098 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.436066 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"]
Apr 24 22:32:36.497460 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497460 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gl4\" (UniqueName: \"kubernetes.io/projected/d71288e3-0a52-4c68-9d3b-643365a30e02-kube-api-access-b8gl4\") pod \"downloads-6bcc868b7-4rldg\" (UID: \"d71288e3-0a52-4c68-9d3b-643365a30e02\") " pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:36.497674 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497674 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-grpc-tls\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497674 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255dh\" (UniqueName: \"kubernetes.io/projected/0c14d682-7850-4b94-aeda-73452c4cca01-kube-api-access-255dh\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497674 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c14d682-7850-4b94-aeda-73452c4cca01-metrics-client-ca\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.497894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.497777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-tls\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.510562 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.510528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8gl4\" (UniqueName: \"kubernetes.io/projected/d71288e3-0a52-4c68-9d3b-643365a30e02-kube-api-access-b8gl4\") pod \"downloads-6bcc868b7-4rldg\" (UID: \"d71288e3-0a52-4c68-9d3b-643365a30e02\") " pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:36.541827 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.541796 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:36.599012 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.598908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.599012 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.598947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.599012 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.598979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-tls\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.599269 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.599025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.599269 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.599111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.599269 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.599147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-grpc-tls\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"
Apr 24 22:32:36.599269 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.599171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-255dh\" (UniqueName: \"kubernetes.io/projected/0c14d682-7850-4b94-aeda-73452c4cca01-kube-api-access-255dh\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") "
pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.599269 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.599197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c14d682-7850-4b94-aeda-73452c4cca01-metrics-client-ca\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.600411 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.600200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c14d682-7850-4b94-aeda-73452c4cca01-metrics-client-ca\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.602723 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.602666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.604767 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.604607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-tls\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.605952 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.605928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-grpc-tls\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.607549 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.607151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.607549 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.607449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.607722 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.607697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c14d682-7850-4b94-aeda-73452c4cca01-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.610871 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.610848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-255dh\" (UniqueName: \"kubernetes.io/projected/0c14d682-7850-4b94-aeda-73452c4cca01-kube-api-access-255dh\") pod \"thanos-querier-646b8cbfc-fqdkv\" (UID: \"0c14d682-7850-4b94-aeda-73452c4cca01\") " 
pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.677495 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.677460 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4rldg"] Apr 24 22:32:36.704048 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.704006 2577 generic.go:358] "Generic (PLEG): container finished" podID="a56a8cc4-5a2f-4884-80eb-0c8d1dee9455" containerID="2847fa97c9774178991c4410b4ccd23ebdc6e3d142110304ca3f683e207f951f" exitCode=0 Apr 24 22:32:36.704192 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.704094 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jtkx" event={"ID":"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455","Type":"ContainerDied","Data":"2847fa97c9774178991c4410b4ccd23ebdc6e3d142110304ca3f683e207f951f"} Apr 24 22:32:36.706227 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.706198 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" event={"ID":"9f9e77c1-2948-4c92-9f23-a02949e0904b","Type":"ContainerStarted","Data":"fa90e04b36ff766667daa050a7c1370be5e5f9b3f43b839adc2c8d9168371780"} Apr 24 22:32:36.707486 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.707467 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"c6924d4b535a1d6fcd51e8ec30756cd241a56a01a3f282bfb5e6847971913cab"} Apr 24 22:32:36.713478 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.713449 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:36.758850 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.758723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkm5w" podStartSLOduration=1.806258847 podStartE2EDuration="2.758699805s" podCreationTimestamp="2026-04-24 22:32:34 +0000 UTC" firstStartedPulling="2026-04-24 22:32:34.923078627 +0000 UTC m=+142.218424852" lastFinishedPulling="2026-04-24 22:32:35.875519589 +0000 UTC m=+143.170865810" observedRunningTime="2026-04-24 22:32:36.757650123 +0000 UTC m=+144.052996364" watchObservedRunningTime="2026-04-24 22:32:36.758699805 +0000 UTC m=+144.054046046" Apr 24 22:32:36.844109 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:36.844074 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd71288e3_0a52_4c68_9d3b_643365a30e02.slice/crio-d0ee1cdb5362bd8219300646ade1ca62feff1d5ef54a490b385a5ffcce66fdb4 WatchSource:0}: Error finding container d0ee1cdb5362bd8219300646ade1ca62feff1d5ef54a490b385a5ffcce66fdb4: Status 404 returned error can't find the container with id d0ee1cdb5362bd8219300646ade1ca62feff1d5ef54a490b385a5ffcce66fdb4 Apr 24 22:32:36.990765 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:36.990717 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-646b8cbfc-fqdkv"] Apr 24 22:32:37.035589 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:37.035511 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c14d682_7850_4b94_aeda_73452c4cca01.slice/crio-bbeb99b1b98839a8048f36fa1fce4b83195a4b1384c27324570ea12cf09953ce WatchSource:0}: Error finding container bbeb99b1b98839a8048f36fa1fce4b83195a4b1384c27324570ea12cf09953ce: Status 404 returned error can't find the container with id 
bbeb99b1b98839a8048f36fa1fce4b83195a4b1384c27324570ea12cf09953ce Apr 24 22:32:37.713312 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.713269 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4rldg" event={"ID":"d71288e3-0a52-4c68-9d3b-643365a30e02","Type":"ContainerStarted","Data":"d0ee1cdb5362bd8219300646ade1ca62feff1d5ef54a490b385a5ffcce66fdb4"} Apr 24 22:32:37.715888 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.715855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jtkx" event={"ID":"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455","Type":"ContainerStarted","Data":"0cea1b0d1446172f3dc1e62a42a9d4d1275b5e8e4604512bfcab643e92e4b660"} Apr 24 22:32:37.716009 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.715896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jtkx" event={"ID":"a56a8cc4-5a2f-4884-80eb-0c8d1dee9455","Type":"ContainerStarted","Data":"d87e4c7c46188b7aa5b6cedc3e71484c2b26cb77377ff875ec250bfe7756b7a6"} Apr 24 22:32:37.717763 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.717718 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6" exitCode=0 Apr 24 22:32:37.717873 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.717850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6"} Apr 24 22:32:37.721952 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.721923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" 
event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"bbeb99b1b98839a8048f36fa1fce4b83195a4b1384c27324570ea12cf09953ce"} Apr 24 22:32:37.738020 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:37.737966 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8jtkx" podStartSLOduration=2.7477460430000002 podStartE2EDuration="3.737948157s" podCreationTimestamp="2026-04-24 22:32:34 +0000 UTC" firstStartedPulling="2026-04-24 22:32:35.274203818 +0000 UTC m=+142.569550036" lastFinishedPulling="2026-04-24 22:32:36.264405931 +0000 UTC m=+143.559752150" observedRunningTime="2026-04-24 22:32:37.736840928 +0000 UTC m=+145.032187168" watchObservedRunningTime="2026-04-24 22:32:37.737948157 +0000 UTC m=+145.033294397" Apr 24 22:32:38.846142 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.845980 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"] Apr 24 22:32:38.849577 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.849552 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:38.852381 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.852353 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 22:32:38.852604 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.852582 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 22:32:38.852935 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.852910 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6bnk084hj22e5\"" Apr 24 22:32:38.854042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.853567 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 22:32:38.854042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.853925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 22:32:38.854042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.854011 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7898k\"" Apr 24 22:32:38.860213 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:38.860038 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"] Apr 24 22:32:39.025022 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.024986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-client-ca-bundle\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " 
pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.025193 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.025045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.025193 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.025134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-audit-log\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.025193 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.025179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqlb\" (UniqueName: \"kubernetes.io/projected/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-kube-api-access-dpqlb\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.025358 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.025276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-secret-metrics-server-tls\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.025392 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.025373 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-secret-metrics-server-client-certs\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.025432 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.025409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-metrics-server-audit-profiles\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.126950 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.126859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-client-ca-bundle\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.126950 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.126915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127184 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-audit-log\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127184 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqlb\" (UniqueName: \"kubernetes.io/projected/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-kube-api-access-dpqlb\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-secret-metrics-server-tls\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127334 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-secret-metrics-server-client-certs\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127388 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-metrics-server-audit-profiles\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " 
pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127455 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-audit-log\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.127696 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.127667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.129040 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.129015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-metrics-server-audit-profiles\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.130361 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.130334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-secret-metrics-server-client-certs\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.131105 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.131059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-secret-metrics-server-tls\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.132192 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.132154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-client-ca-bundle\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.140309 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.140264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqlb\" (UniqueName: \"kubernetes.io/projected/e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05-kube-api-access-dpqlb\") pod \"metrics-server-5fbf6988d5-lg6v2\" (UID: \"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05\") " pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.165082 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.164662 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" Apr 24 22:32:39.679857 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.679823 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"] Apr 24 22:32:39.683903 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:39.683872 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e1c5a9_e806_4eb4_8d76_8f9aa1803e05.slice/crio-851552279d7652e048297ff43be60d50e0c7cc63c3e93a9e0b3cccf457889b2f WatchSource:0}: Error finding container 851552279d7652e048297ff43be60d50e0c7cc63c3e93a9e0b3cccf457889b2f: Status 404 returned error can't find the container with id 851552279d7652e048297ff43be60d50e0c7cc63c3e93a9e0b3cccf457889b2f Apr 24 22:32:39.730331 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.730256 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" event={"ID":"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05","Type":"ContainerStarted","Data":"851552279d7652e048297ff43be60d50e0c7cc63c3e93a9e0b3cccf457889b2f"} Apr 24 22:32:39.734026 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.733984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74"} Apr 24 22:32:39.737332 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.737307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"462bf66c35f867e5968480e88240370982306d0a6edc086bc3fd8d3e684a96de"} Apr 24 22:32:39.737416 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:39.737338 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"21e535f4599f9952e4e38e3bb9d4ae465718240c696ab6172ec50369a1728f47"} Apr 24 22:32:40.634089 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.631635 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:32:40.637035 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.637010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.639924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.639896 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 22:32:40.640088 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.640061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 22:32:40.640165 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.640153 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 22:32:40.640355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.640329 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2ie52qm24gs8r\"" Apr 24 22:32:40.640542 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.640525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 22:32:40.640614 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.640580 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 22:32:40.640714 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:32:40.640534 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 22:32:40.640863 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.640842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 22:32:40.641095 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.641080 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 22:32:40.641203 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.641184 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v7cxk\"" Apr 24 22:32:40.641275 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.641259 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 22:32:40.641436 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.641404 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 22:32:40.641538 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.641081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 22:32:40.643238 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.643216 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 22:32:40.649279 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.648908 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:32:40.743426 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743387 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743426 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743579 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743579 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743579 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743543 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743690 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743690 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743766 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgfrk\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-kube-api-access-lgfrk\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743766 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.743901 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.744099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.744099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.743990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.744099 ip-10-0-134-101 kubenswrapper[2577]: 
I0424 22:32:40.744020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.744099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.744047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.744099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.744075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.744099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.744101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.746278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.746161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5"} Apr 24 22:32:40.746278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.746206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb"} Apr 24 22:32:40.746278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.746221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d"} Apr 24 22:32:40.746278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.746237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921"} Apr 24 22:32:40.751428 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.751385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"d835dffef7fe5452ed65bff6f8295fe51c6e2aebb0ffa147850eb67fe42de1fd"} Apr 24 22:32:40.845204 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845373 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:32:40.845219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845373 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845373 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845373 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845373 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgfrk\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-kube-api-access-lgfrk\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.845652 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.846075 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.846075 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.846075 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.845788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.846279 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.846221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.847151 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.846819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.847661 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.847212 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.847805 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.847784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.850405 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.850379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.850604 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.850581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.851278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.851257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.851521 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.851499 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.851985 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.851934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.852575 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.852550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.853345 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.853283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.854434 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.853991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.854434 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.854397 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.855234 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.855195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.855436 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.855415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.855522 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.855498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.857468 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.857450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.861232 ip-10-0-134-101 
kubenswrapper[2577]: I0424 22:32:40.861208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgfrk\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-kube-api-access-lgfrk\") pod \"prometheus-k8s-0\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:40.953643 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:40.953606 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:41.339373 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.339338 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:32:41.341254 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:41.341182 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f11f4a_5570_4741_8e3f_dcb33449fcc2.slice/crio-98739c984366668af956c952ad0c1d323e2bbc92b642b875a20eba100f36362c WatchSource:0}: Error finding container 98739c984366668af956c952ad0c1d323e2bbc92b642b875a20eba100f36362c: Status 404 returned error can't find the container with id 98739c984366668af956c952ad0c1d323e2bbc92b642b875a20eba100f36362c Apr 24 22:32:41.757622 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.757577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"ce37fa6db394d8811a7ec3812dad3f50a17f93182c9c2c94d24ae36705fb904a"} Apr 24 22:32:41.757622 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.757620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"104e9471f23301b831b6dfd4fe8766021dfdb2e26ddfb4094a60a8da6e79431b"} 
Apr 24 22:32:41.758076 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.757635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" event={"ID":"0c14d682-7850-4b94-aeda-73452c4cca01","Type":"ContainerStarted","Data":"0079bf2615153b13e94d6fe68b4c2ef060f78a16eb32f5ee19139a9becf14432"} Apr 24 22:32:41.758273 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.758248 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:41.759314 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.759290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" event={"ID":"e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05","Type":"ContainerStarted","Data":"17520c7832ad53adcb56ba06b4e617289142e0b5cd1abd315dfc12111a78d3f2"} Apr 24 22:32:41.760721 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.760695 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f" exitCode=0 Apr 24 22:32:41.760832 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.760784 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} Apr 24 22:32:41.760832 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.760813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"98739c984366668af956c952ad0c1d323e2bbc92b642b875a20eba100f36362c"} Apr 24 22:32:41.764607 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.764577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerStarted","Data":"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b"} Apr 24 22:32:41.782511 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.782452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" podStartSLOduration=2.142401135 podStartE2EDuration="5.78243231s" podCreationTimestamp="2026-04-24 22:32:36 +0000 UTC" firstStartedPulling="2026-04-24 22:32:37.0375778 +0000 UTC m=+144.332924018" lastFinishedPulling="2026-04-24 22:32:40.677608964 +0000 UTC m=+147.972955193" observedRunningTime="2026-04-24 22:32:41.780049374 +0000 UTC m=+149.075395614" watchObservedRunningTime="2026-04-24 22:32:41.78243231 +0000 UTC m=+149.077778555" Apr 24 22:32:41.803496 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.803435 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2" podStartSLOduration=2.253866976 podStartE2EDuration="3.80342054s" podCreationTimestamp="2026-04-24 22:32:38 +0000 UTC" firstStartedPulling="2026-04-24 22:32:39.686244625 +0000 UTC m=+146.981590843" lastFinishedPulling="2026-04-24 22:32:41.235798169 +0000 UTC m=+148.531144407" observedRunningTime="2026-04-24 22:32:41.801971278 +0000 UTC m=+149.097317519" watchObservedRunningTime="2026-04-24 22:32:41.80342054 +0000 UTC m=+149.098766976" Apr 24 22:32:41.885456 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:41.885390 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.177359872 podStartE2EDuration="6.885372034s" podCreationTimestamp="2026-04-24 22:32:35 +0000 UTC" firstStartedPulling="2026-04-24 22:32:35.971182455 +0000 UTC m=+143.266528672" lastFinishedPulling="2026-04-24 22:32:40.679194607 +0000 UTC m=+147.974540834" 
observedRunningTime="2026-04-24 22:32:41.843239072 +0000 UTC m=+149.138585312" watchObservedRunningTime="2026-04-24 22:32:41.885372034 +0000 UTC m=+149.180718275" Apr 24 22:32:42.147925 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:42.147839 2577 patch_prober.go:28] interesting pod/image-registry-bfbcbcd4d-g6dfw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 22:32:42.147925 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:42.147895 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:32:45.796289 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.785458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} Apr 24 22:32:45.796289 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.785506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} Apr 24 22:32:45.796289 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.785519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} Apr 24 22:32:45.796289 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:32:45.785531 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} Apr 24 22:32:45.796289 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.785543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} Apr 24 22:32:45.796289 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.785555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerStarted","Data":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} Apr 24 22:32:45.817828 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.817767 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.822490153 podStartE2EDuration="5.817731963s" podCreationTimestamp="2026-04-24 22:32:40 +0000 UTC" firstStartedPulling="2026-04-24 22:32:41.76236989 +0000 UTC m=+149.057716110" lastFinishedPulling="2026-04-24 22:32:44.757611683 +0000 UTC m=+152.052957920" observedRunningTime="2026-04-24 22:32:45.814938952 +0000 UTC m=+153.110285194" watchObservedRunningTime="2026-04-24 22:32:45.817731963 +0000 UTC m=+153.113078203" Apr 24 22:32:45.953955 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:45.953916 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:32:47.162797 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:47.162713 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" 
podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" containerID="cri-o://78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407" gracePeriod=30 Apr 24 22:32:48.779548 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:48.779521 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-646b8cbfc-fqdkv" Apr 24 22:32:49.692790 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:49.692720 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fl5t5" podUID="0fda03dc-9085-4272-aa29-583404383acf" Apr 24 22:32:49.702968 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:49.702926 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-48kr8" podUID="7cda4d53-f962-44ab-a661-325196e9edf2" Apr 24 22:32:49.798699 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:49.798664 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:32:52.143451 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:52.143413 2577 patch_prober.go:28] interesting pod/image-registry-bfbcbcd4d-g6dfw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.133.0.12:5000/healthz\": dial tcp 10.133.0.12:5000: connect: connection refused" start-of-body=
Apr 24 22:32:52.143914 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:52.143471 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" probeResult="failure" output="Get \"https://10.133.0.12:5000/healthz\": dial tcp 10.133.0.12:5000: connect: connection refused"
Apr 24 22:32:53.810532 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.810512 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:53.812856 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.812829 2577 generic.go:358] "Generic (PLEG): container finished" podID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerID="78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407" exitCode=0
Apr 24 22:32:53.812941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.812912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" event={"ID":"33723fd1-590b-4af7-ba53-6d39a38013fd","Type":"ContainerDied","Data":"78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407"}
Apr 24 22:32:53.812994 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.812938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw" 
event={"ID":"33723fd1-590b-4af7-ba53-6d39a38013fd","Type":"ContainerDied","Data":"e497d7806ac4136ff7dd8254a97d75082fcbbb76c42eced726aeae66d0c6473e"}
Apr 24 22:32:53.812994 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.812954 2577 scope.go:117] "RemoveContainer" containerID="78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407"
Apr 24 22:32:53.813074 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.813059 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"
Apr 24 22:32:53.820832 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.820726 2577 scope.go:117] "RemoveContainer" containerID="78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407"
Apr 24 22:32:53.821074 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:32:53.821022 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407\": container with ID starting with 78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407 not found: ID does not exist" containerID="78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407"
Apr 24 22:32:53.821135 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.821057 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407"} err="failed to get container status \"78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407\": rpc error: code = NotFound desc = could not find container \"78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407\": container with ID starting with 78fdb69bb36e5020f861fdd5cc735000e5854dd4c90e56ae770495fe32146407 not found: ID does not exist"
Apr 24 22:32:53.880808 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.880730 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wvhb8\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-kube-api-access-wvhb8\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.880975 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.880830 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-image-registry-private-configuration\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.880975 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.880870 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-certificates\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.880975 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.880900 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-installation-pull-secrets\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.881161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.880987 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33723fd1-590b-4af7-ba53-6d39a38013fd-ca-trust-extracted\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.881161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.881047 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-trusted-ca\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.881161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.881070 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.881161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.881105 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-bound-sa-token\") pod \"33723fd1-590b-4af7-ba53-6d39a38013fd\" (UID: \"33723fd1-590b-4af7-ba53-6d39a38013fd\") "
Apr 24 22:32:53.882084 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.882003 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:53.882447 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.882419 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:53.883965 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.883868 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-kube-api-access-wvhb8" (OuterVolumeSpecName: "kube-api-access-wvhb8") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "kube-api-access-wvhb8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:32:53.884134 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.884098 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:53.884469 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.884427 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:32:53.884865 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.884839 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:53.885147 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.885124 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:32:53.890537 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.890514 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33723fd1-590b-4af7-ba53-6d39a38013fd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "33723fd1-590b-4af7-ba53-6d39a38013fd" (UID: "33723fd1-590b-4af7-ba53-6d39a38013fd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:32:53.982784 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982683 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/33723fd1-590b-4af7-ba53-6d39a38013fd-ca-trust-extracted\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.982784 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982719 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-trusted-ca\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.982784 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982734 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-tls\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.982784 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:32:53.982764 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-bound-sa-token\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.982784 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982777 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvhb8\" (UniqueName: \"kubernetes.io/projected/33723fd1-590b-4af7-ba53-6d39a38013fd-kube-api-access-wvhb8\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.982784 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982787 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-image-registry-private-configuration\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.983080 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982798 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/33723fd1-590b-4af7-ba53-6d39a38013fd-registry-certificates\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:53.983080 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:53.982807 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/33723fd1-590b-4af7-ba53-6d39a38013fd-installation-pull-secrets\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\""
Apr 24 22:32:54.138344 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.138304 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"]
Apr 24 22:32:54.143569 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.143533 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-bfbcbcd4d-g6dfw"]
Apr 24 
22:32:54.587881 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.587834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:32:54.588072 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.587965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:32:54.590795 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.590770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fda03dc-9085-4272-aa29-583404383acf-metrics-tls\") pod \"dns-default-fl5t5\" (UID: \"0fda03dc-9085-4272-aa29-583404383acf\") " pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:32:54.590916 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.590853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cda4d53-f962-44ab-a661-325196e9edf2-cert\") pod \"ingress-canary-48kr8\" (UID: \"7cda4d53-f962-44ab-a661-325196e9edf2\") " pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:32:54.602735 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.602706 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svq4d\""
Apr 24 22:32:54.610166 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.610132 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:32:54.764935 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.764883 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fl5t5"]
Apr 24 22:32:54.766954 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:32:54.766920 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fda03dc_9085_4272_aa29_583404383acf.slice/crio-b009887992ad4a0d24f3cf9e28b90d97491178e5e4322d8909781297b3f347ca WatchSource:0}: Error finding container b009887992ad4a0d24f3cf9e28b90d97491178e5e4322d8909781297b3f347ca: Status 404 returned error can't find the container with id b009887992ad4a0d24f3cf9e28b90d97491178e5e4322d8909781297b3f347ca
Apr 24 22:32:54.818563 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.818516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fl5t5" event={"ID":"0fda03dc-9085-4272-aa29-583404383acf","Type":"ContainerStarted","Data":"b009887992ad4a0d24f3cf9e28b90d97491178e5e4322d8909781297b3f347ca"}
Apr 24 22:32:54.821040 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.820484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4rldg" event={"ID":"d71288e3-0a52-4c68-9d3b-643365a30e02","Type":"ContainerStarted","Data":"1b4ea6d8bfd6668c92cc9e1638561b36d47834a70379dd4688de83be1c2c3259"}
Apr 24 22:32:54.821040 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.820902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:54.836560 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.836529 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4rldg"
Apr 24 22:32:54.840084 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:54.839992 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4rldg" podStartSLOduration=1.9290105 podStartE2EDuration="18.83997602s" podCreationTimestamp="2026-04-24 22:32:36 +0000 UTC" firstStartedPulling="2026-04-24 22:32:36.846561679 +0000 UTC m=+144.141907897" lastFinishedPulling="2026-04-24 22:32:53.7575272 +0000 UTC m=+161.052873417" observedRunningTime="2026-04-24 22:32:54.838605546 +0000 UTC m=+162.133951809" watchObservedRunningTime="2026-04-24 22:32:54.83997602 +0000 UTC m=+162.135322262"
Apr 24 22:32:55.265265 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:55.265223 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" path="/var/lib/kubelet/pods/33723fd1-590b-4af7-ba53-6d39a38013fd/volumes"
Apr 24 22:32:57.832881 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:57.832840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fl5t5" event={"ID":"0fda03dc-9085-4272-aa29-583404383acf","Type":"ContainerStarted","Data":"d6da73035338887a6c15faa63d43db71211d349b4a18457c01263e246c822f6d"}
Apr 24 22:32:57.833338 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:57.832890 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fl5t5" event={"ID":"0fda03dc-9085-4272-aa29-583404383acf","Type":"ContainerStarted","Data":"de41d80de838907a1c4f926dfee6ed17dd731018b91335a42eb7098e5402dea6"}
Apr 24 22:32:57.833338 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:57.833048 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:32:57.850882 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:57.850837 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fl5t5" podStartSLOduration=129.768457732 podStartE2EDuration="2m11.850815642s" podCreationTimestamp="2026-04-24 22:30:46 +0000 UTC" 
firstStartedPulling="2026-04-24 22:32:54.7693367 +0000 UTC m=+162.064682922" lastFinishedPulling="2026-04-24 22:32:56.851694613 +0000 UTC m=+164.147040832" observedRunningTime="2026-04-24 22:32:57.849677389 +0000 UTC m=+165.145023640" watchObservedRunningTime="2026-04-24 22:32:57.850815642 +0000 UTC m=+165.146161886"
Apr 24 22:32:59.165274 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:59.165235 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"
Apr 24 22:32:59.165710 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:59.165319 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"
Apr 24 22:32:59.840323 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:59.840287 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2102ed1-69af-409e-b44a-dad1845de530" containerID="ee39b17690ea8da45eca63f9f691db820f1665d71062a65b4f04e326d2995df1" exitCode=0
Apr 24 22:32:59.840519 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:59.840367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" event={"ID":"c2102ed1-69af-409e-b44a-dad1845de530","Type":"ContainerDied","Data":"ee39b17690ea8da45eca63f9f691db820f1665d71062a65b4f04e326d2995df1"}
Apr 24 22:32:59.840963 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:32:59.840936 2577 scope.go:117] "RemoveContainer" containerID="ee39b17690ea8da45eca63f9f691db820f1665d71062a65b4f04e326d2995df1"
Apr 24 22:33:00.845675 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:00.845641 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lj9nb" event={"ID":"c2102ed1-69af-409e-b44a-dad1845de530","Type":"ContainerStarted","Data":"9a59fdf5ae51e205634bdb0220f33d2ca37f79ffcd77e716e09d39b9f5c6877e"}
Apr 24 22:33:01.504032 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:33:01.503998 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fl5t5_0fda03dc-9085-4272-aa29-583404383acf/dns/0.log"
Apr 24 22:33:01.707526 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:01.707488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fl5t5_0fda03dc-9085-4272-aa29-583404383acf/kube-rbac-proxy/0.log"
Apr 24 22:33:02.303904 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:02.303878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4v6p7_92fe7641-e2dc-499a-a25d-09cdcdac368b/dns-node-resolver/0.log"
Apr 24 22:33:04.259676 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:04.259644 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:33:04.263558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:04.263539 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdthp\""
Apr 24 22:33:04.270819 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:04.270798 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-48kr8"
Apr 24 22:33:04.394013 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:04.393976 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-48kr8"]
Apr 24 22:33:04.396387 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:33:04.396351 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cda4d53_f962_44ab_a661_325196e9edf2.slice/crio-8a19709b47ec6bad7352598834533ae6ebd9ec3a6e61d4f376a9c6b8bd7b245f WatchSource:0}: Error finding container 8a19709b47ec6bad7352598834533ae6ebd9ec3a6e61d4f376a9c6b8bd7b245f: Status 404 returned error can't find the container with id 8a19709b47ec6bad7352598834533ae6ebd9ec3a6e61d4f376a9c6b8bd7b245f
Apr 24 22:33:04.859186 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:04.859140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-48kr8" event={"ID":"7cda4d53-f962-44ab-a661-325196e9edf2","Type":"ContainerStarted","Data":"8a19709b47ec6bad7352598834533ae6ebd9ec3a6e61d4f376a9c6b8bd7b245f"}
Apr 24 22:33:05.864799 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:05.864737 2577 generic.go:358] "Generic (PLEG): container finished" podID="5c42aa5d-90ad-4392-9988-c472ad007628" containerID="28984f9e74993e1ca9c5ad6b56ac8ca3e821f0291290b74dcdca62a4ca0dab1d" exitCode=0
Apr 24 22:33:05.865244 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:05.864860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" event={"ID":"5c42aa5d-90ad-4392-9988-c472ad007628","Type":"ContainerDied","Data":"28984f9e74993e1ca9c5ad6b56ac8ca3e821f0291290b74dcdca62a4ca0dab1d"}
Apr 24 22:33:05.865341 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:05.865324 2577 scope.go:117] "RemoveContainer" containerID="28984f9e74993e1ca9c5ad6b56ac8ca3e821f0291290b74dcdca62a4ca0dab1d" 
Apr 24 22:33:06.869607 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:06.869565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fxkzv" event={"ID":"5c42aa5d-90ad-4392-9988-c472ad007628","Type":"ContainerStarted","Data":"be933a07400db49fd935a6b110808a23559eddbefc4328f05e2255ca0a3cc642"}
Apr 24 22:33:06.870925 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:06.870891 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-48kr8" event={"ID":"7cda4d53-f962-44ab-a661-325196e9edf2","Type":"ContainerStarted","Data":"923f129180c97ff32d2852894f706870dce7f3930c9f739113ee9fb97789ebad"}
Apr 24 22:33:06.910391 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:06.910217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-48kr8" podStartSLOduration=138.99399516 podStartE2EDuration="2m20.910197521s" podCreationTimestamp="2026-04-24 22:30:46 +0000 UTC" firstStartedPulling="2026-04-24 22:33:04.401455803 +0000 UTC m=+171.696802020" lastFinishedPulling="2026-04-24 22:33:06.317658156 +0000 UTC m=+173.613004381" observedRunningTime="2026-04-24 22:33:06.910112938 +0000 UTC m=+174.205459192" watchObservedRunningTime="2026-04-24 22:33:06.910197521 +0000 UTC m=+174.205543762"
Apr 24 22:33:07.838353 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:07.838319 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fl5t5"
Apr 24 22:33:19.170542 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:19.170512 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"
Apr 24 22:33:19.174567 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:19.174541 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fbf6988d5-lg6v2"
Apr 24 
22:33:40.954000 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:40.953967 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:33:41.006799 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:41.006770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:33:41.022305 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:41.022279 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:33:54.734267 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:54.734230 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:33:54.734683 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:54.734658 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="alertmanager" containerID="cri-o://90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74" gracePeriod=120
Apr 24 22:33:54.734780 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:54.734717 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-metric" containerID="cri-o://52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5" gracePeriod=120
Apr 24 22:33:54.734923 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:54.734828 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy" containerID="cri-o://8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb" gracePeriod=120
Apr 24 22:33:54.734923 ip-10-0-134-101 kubenswrapper[2577]: 
I0424 22:33:54.734864 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="config-reloader" containerID="cri-o://454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921" gracePeriod=120
Apr 24 22:33:54.734923 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:54.734783 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="prom-label-proxy" containerID="cri-o://0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b" gracePeriod=120
Apr 24 22:33:54.735062 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:54.734812 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-web" containerID="cri-o://0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d" gracePeriod=120
Apr 24 22:33:55.018354 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018267 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b" exitCode=0
Apr 24 22:33:55.018354 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018294 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb" exitCode=0
Apr 24 22:33:55.018354 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018301 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921" exitCode=0
Apr 24 22:33:55.018354 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:33:55.018308 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74" exitCode=0
Apr 24 22:33:55.018354 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b"}
Apr 24 22:33:55.018609 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb"}
Apr 24 22:33:55.018609 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921"}
Apr 24 22:33:55.018609 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.018395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74"}
Apr 24 22:33:55.973636 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:55.973615 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026663 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5" exitCode=0 Apr 24 22:33:56.026693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026687 2577 generic.go:358] "Generic (PLEG): container finished" podID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerID="0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d" exitCode=0 Apr 24 22:33:56.026933 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026758 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5"} Apr 24 22:33:56.026933 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026787 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.026933 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026797 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d"} Apr 24 22:33:56.026933 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026811 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d385150b-b9ae-4267-852b-d4bf79c4b9ab","Type":"ContainerDied","Data":"c6924d4b535a1d6fcd51e8ec30756cd241a56a01a3f282bfb5e6847971913cab"} Apr 24 22:33:56.026933 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.026826 2577 scope.go:117] "RemoveContainer" containerID="0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b" Apr 24 22:33:56.028832 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028813 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-trusted-ca-bundle\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.028941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028840 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-tls-assets\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.028941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028865 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-volume\") pod 
\"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.028941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028880 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-web-config\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.028941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028909 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029146 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028968 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-web\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029146 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.028999 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029146 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029031 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkzbq\" (UniqueName: 
\"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-kube-api-access-mkzbq\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029146 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029054 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-metrics-client-ca\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029350 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029146 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-main-tls\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029350 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029183 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-main-db\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029350 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029214 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-out\") pod \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029350 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029302 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-cluster-tls-config\") pod 
\"d385150b-b9ae-4267-852b-d4bf79c4b9ab\" (UID: \"d385150b-b9ae-4267-852b-d4bf79c4b9ab\") " Apr 24 22:33:56.029552 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029488 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:56.030163 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029917 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:33:56.030163 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.029922 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:56.030321 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.030278 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-metrics-client-ca\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.030378 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.030332 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-main-db\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.030459 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.030437 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d385150b-b9ae-4267-852b-d4bf79c4b9ab-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.033971 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.033910 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-out" (OuterVolumeSpecName: "config-out") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:33:56.033971 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.033928 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.034515 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.034487 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.034683 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.034657 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-kube-api-access-mkzbq" (OuterVolumeSpecName: "kube-api-access-mkzbq") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "kube-api-access-mkzbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:56.034774 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.034734 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:56.034948 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.034910 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.035068 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.035044 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.035134 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.035061 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.036702 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.036685 2577 scope.go:117] "RemoveContainer" containerID="52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5" Apr 24 22:33:56.040538 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.040497 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.047333 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.047309 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-web-config" (OuterVolumeSpecName: "web-config") pod "d385150b-b9ae-4267-852b-d4bf79c4b9ab" (UID: "d385150b-b9ae-4267-852b-d4bf79c4b9ab"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:56.048441 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.048424 2577 scope.go:117] "RemoveContainer" containerID="8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb" Apr 24 22:33:56.056261 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.056208 2577 scope.go:117] "RemoveContainer" containerID="0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d" Apr 24 22:33:56.062409 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.062392 2577 scope.go:117] "RemoveContainer" containerID="454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921" Apr 24 22:33:56.068767 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.068723 2577 scope.go:117] "RemoveContainer" containerID="90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74" Apr 24 22:33:56.075130 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.075114 2577 scope.go:117] "RemoveContainer" containerID="353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6" Apr 24 22:33:56.081341 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.081325 2577 scope.go:117] "RemoveContainer" containerID="0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b" Apr 24 22:33:56.081609 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.081588 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b\": container with ID starting with 
0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b not found: ID does not exist" containerID="0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b" Apr 24 22:33:56.081664 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.081619 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b"} err="failed to get container status \"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b\": rpc error: code = NotFound desc = could not find container \"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b\": container with ID starting with 0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b not found: ID does not exist" Apr 24 22:33:56.081664 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.081639 2577 scope.go:117] "RemoveContainer" containerID="52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5" Apr 24 22:33:56.081884 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.081868 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5\": container with ID starting with 52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5 not found: ID does not exist" containerID="52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5" Apr 24 22:33:56.081947 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.081888 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5"} err="failed to get container status \"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5\": rpc error: code = NotFound desc = could not find container \"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5\": container with ID starting with 
52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5 not found: ID does not exist" Apr 24 22:33:56.081947 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.081902 2577 scope.go:117] "RemoveContainer" containerID="8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb" Apr 24 22:33:56.082117 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.082098 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb\": container with ID starting with 8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb not found: ID does not exist" containerID="8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb" Apr 24 22:33:56.082161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082126 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb"} err="failed to get container status \"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb\": rpc error: code = NotFound desc = could not find container \"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb\": container with ID starting with 8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb not found: ID does not exist" Apr 24 22:33:56.082161 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082143 2577 scope.go:117] "RemoveContainer" containerID="0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d" Apr 24 22:33:56.082336 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.082322 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d\": container with ID starting with 0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d not found: ID does not exist" 
containerID="0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d" Apr 24 22:33:56.082381 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082338 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d"} err="failed to get container status \"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d\": rpc error: code = NotFound desc = could not find container \"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d\": container with ID starting with 0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d not found: ID does not exist" Apr 24 22:33:56.082381 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082349 2577 scope.go:117] "RemoveContainer" containerID="454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921" Apr 24 22:33:56.082536 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.082518 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921\": container with ID starting with 454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921 not found: ID does not exist" containerID="454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921" Apr 24 22:33:56.082599 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082545 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921"} err="failed to get container status \"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921\": rpc error: code = NotFound desc = could not find container \"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921\": container with ID starting with 454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921 not found: ID does not exist" Apr 24 
22:33:56.082599 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082567 2577 scope.go:117] "RemoveContainer" containerID="90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74" Apr 24 22:33:56.082817 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.082775 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74\": container with ID starting with 90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74 not found: ID does not exist" containerID="90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74" Apr 24 22:33:56.082817 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082807 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74"} err="failed to get container status \"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74\": rpc error: code = NotFound desc = could not find container \"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74\": container with ID starting with 90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74 not found: ID does not exist" Apr 24 22:33:56.082905 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.082824 2577 scope.go:117] "RemoveContainer" containerID="353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6" Apr 24 22:33:56.083063 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:33:56.083047 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6\": container with ID starting with 353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6 not found: ID does not exist" containerID="353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6" Apr 24 22:33:56.083127 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083070 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6"} err="failed to get container status \"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6\": rpc error: code = NotFound desc = could not find container \"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6\": container with ID starting with 353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6 not found: ID does not exist" Apr 24 22:33:56.083127 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083090 2577 scope.go:117] "RemoveContainer" containerID="0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b" Apr 24 22:33:56.083320 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083302 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b"} err="failed to get container status \"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b\": rpc error: code = NotFound desc = could not find container \"0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b\": container with ID starting with 0568df0c4ef2a524baa2569e8fa8b05f3d5ed2ea5694f6621cc2f3c4dc803c2b not found: ID does not exist" Apr 24 22:33:56.083358 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083322 2577 scope.go:117] "RemoveContainer" containerID="52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5" Apr 24 22:33:56.083513 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083497 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5"} err="failed to get container status \"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5\": rpc error: code = NotFound desc = could not 
find container \"52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5\": container with ID starting with 52ffba8695eeb0188067665ee1e95764147bbb407093eea78c26841fc8621cc5 not found: ID does not exist" Apr 24 22:33:56.083555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083513 2577 scope.go:117] "RemoveContainer" containerID="8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb" Apr 24 22:33:56.083701 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083683 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb"} err="failed to get container status \"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb\": rpc error: code = NotFound desc = could not find container \"8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb\": container with ID starting with 8c82980129e34498d06dc84ade08a75dd3b77d4d558c23cc49fcd337ab8d2deb not found: ID does not exist" Apr 24 22:33:56.083782 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083702 2577 scope.go:117] "RemoveContainer" containerID="0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d" Apr 24 22:33:56.083910 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083890 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d"} err="failed to get container status \"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d\": rpc error: code = NotFound desc = could not find container \"0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d\": container with ID starting with 0c749dae23dc911e593f3f11fa29423a34f6d55d423fc516348829301e3c850d not found: ID does not exist" Apr 24 22:33:56.083973 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.083911 2577 scope.go:117] "RemoveContainer" 
containerID="454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921" Apr 24 22:33:56.084120 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.084104 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921"} err="failed to get container status \"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921\": rpc error: code = NotFound desc = could not find container \"454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921\": container with ID starting with 454a55bcbeafb2875505977f5dad198c3d9c2ea2dee2f49b5bfdfcafaaf53921 not found: ID does not exist" Apr 24 22:33:56.084120 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.084119 2577 scope.go:117] "RemoveContainer" containerID="90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74" Apr 24 22:33:56.084311 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.084293 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74"} err="failed to get container status \"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74\": rpc error: code = NotFound desc = could not find container \"90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74\": container with ID starting with 90b57353ba6048b58a97b04a9f152fccf161d2b3de17f5f7b110246fdb0ffa74 not found: ID does not exist" Apr 24 22:33:56.084311 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.084310 2577 scope.go:117] "RemoveContainer" containerID="353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6" Apr 24 22:33:56.084464 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.084449 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6"} err="failed to get container status 
\"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6\": rpc error: code = NotFound desc = could not find container \"353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6\": container with ID starting with 353d2877b207bd2c910084bac6cc417868b4e5fb55e415799dde223566d573d6 not found: ID does not exist" Apr 24 22:33:56.131168 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131144 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-cluster-tls-config\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131168 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131166 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-tls-assets\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131175 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-volume\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131183 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-web-config\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131192 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:33:56.131202 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131211 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131220 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkzbq\" (UniqueName: \"kubernetes.io/projected/d385150b-b9ae-4267-852b-d4bf79c4b9ab-kube-api-access-mkzbq\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131230 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d385150b-b9ae-4267-852b-d4bf79c4b9ab-secret-alertmanager-main-tls\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.131296 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.131239 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d385150b-b9ae-4267-852b-d4bf79c4b9ab-config-out\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:56.350362 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.350338 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:56.356004 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.355982 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 
22:33:56.392830 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.392786 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:56.393084 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393071 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="init-config-reloader" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393085 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="init-config-reloader" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393099 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="alertmanager" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393104 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="alertmanager" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393111 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="config-reloader" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393116 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="config-reloader" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393131 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="prom-label-proxy" Apr 24 22:33:56.393132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393136 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="prom-label-proxy" Apr 24 22:33:56.393389 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393142 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393147 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393153 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393158 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393165 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-web" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393178 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-web" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393186 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-metric" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393191 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-metric" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393230 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" 
containerName="kube-rbac-proxy" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393238 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-metric" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393243 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="prom-label-proxy" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393251 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="kube-rbac-proxy-web" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393263 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="config-reloader" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393270 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" containerName="alertmanager" Apr 24 22:33:56.393389 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.393277 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="33723fd1-590b-4af7-ba53-6d39a38013fd" containerName="registry" Apr 24 22:33:56.402151 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.402126 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.405110 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405086 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 22:33:56.405230 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 22:33:56.405230 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405162 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 22:33:56.405351 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405258 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 22:33:56.405404 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405389 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 22:33:56.405494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 22:33:56.405967 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405952 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 22:33:56.406051 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.405954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 22:33:56.406051 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.406025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-rqn8z\"" Apr 24 22:33:56.412091 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.412068 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:56.414722 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.414705 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 22:33:56.533479 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e94ba290-a213-4ae9-a127-b390467cc4c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533641 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-web-config\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533641 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533641 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e94ba290-a213-4ae9-a127-b390467cc4c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533641 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533641 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptlr\" (UniqueName: \"kubernetes.io/projected/e94ba290-a213-4ae9-a127-b390467cc4c9-kube-api-access-7ptlr\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e94ba290-a213-4ae9-a127-b390467cc4c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e94ba290-a213-4ae9-a127-b390467cc4c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e94ba290-a213-4ae9-a127-b390467cc4c9-config-out\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.533860 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.533859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634493 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e94ba290-a213-4ae9-a127-b390467cc4c9-config-out\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634493 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634493 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e94ba290-a213-4ae9-a127-b390467cc4c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-web-config\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e94ba290-a213-4ae9-a127-b390467cc4c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-config-volume\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptlr\" (UniqueName: \"kubernetes.io/projected/e94ba290-a213-4ae9-a127-b390467cc4c9-kube-api-access-7ptlr\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634710 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e94ba290-a213-4ae9-a127-b390467cc4c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.634741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e94ba290-a213-4ae9-a127-b390467cc4c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.635168 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.635168 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.635168 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.634942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e94ba290-a213-4ae9-a127-b390467cc4c9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.635348 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.635323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e94ba290-a213-4ae9-a127-b390467cc4c9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.635773 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.635702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e94ba290-a213-4ae9-a127-b390467cc4c9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638041 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.637997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e94ba290-a213-4ae9-a127-b390467cc4c9-config-out\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638245 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638245 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638361 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-web-config\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638409 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638409 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e94ba290-a213-4ae9-a127-b390467cc4c9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638510 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.638787 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.638772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-main-tls\") pod 
\"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.639424 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.639408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e94ba290-a213-4ae9-a127-b390467cc4c9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.643459 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.643441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptlr\" (UniqueName: \"kubernetes.io/projected/e94ba290-a213-4ae9-a127-b390467cc4c9-kube-api-access-7ptlr\") pod \"alertmanager-main-0\" (UID: \"e94ba290-a213-4ae9-a127-b390467cc4c9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.713212 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.713173 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:33:56.840642 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:56.840617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:33:56.843085 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:33:56.843060 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode94ba290_a213_4ae9_a127_b390467cc4c9.slice/crio-b557d795998137be680cb893f9f546c2cb7b5598efc27e50d323b85c8f9292da WatchSource:0}: Error finding container b557d795998137be680cb893f9f546c2cb7b5598efc27e50d323b85c8f9292da: Status 404 returned error can't find the container with id b557d795998137be680cb893f9f546c2cb7b5598efc27e50d323b85c8f9292da Apr 24 22:33:57.031686 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:57.031654 2577 generic.go:358] "Generic (PLEG): container finished" podID="e94ba290-a213-4ae9-a127-b390467cc4c9" containerID="437ceb69e6986e567d1922a9b26a791025e7b51b87dfa577e4a13c43f6a59a60" exitCode=0 Apr 24 22:33:57.032083 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:57.031694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerDied","Data":"437ceb69e6986e567d1922a9b26a791025e7b51b87dfa577e4a13c43f6a59a60"} Apr 24 22:33:57.032083 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:57.031719 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"b557d795998137be680cb893f9f546c2cb7b5598efc27e50d323b85c8f9292da"} Apr 24 22:33:57.264075 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:57.264030 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d385150b-b9ae-4267-852b-d4bf79c4b9ab" 
path="/var/lib/kubelet/pods/d385150b-b9ae-4267-852b-d4bf79c4b9ab/volumes" Apr 24 22:33:58.037690 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.037654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"c62147a8ecccc171b8d6c6cbf2699339d949092a9b2a12d9397dd525d3c376c9"} Apr 24 22:33:58.037690 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.037694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"612ee69911f60575f27fa98acb986e89cac926b87593e2427c1a9d2735066138"} Apr 24 22:33:58.038101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.037704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"a344c7e8f0374a3976834d4840506668037fe6b2b2c44a5ea81feec936720295"} Apr 24 22:33:58.038101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.037714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"3687289f6d0633cc4113e031379aea12c0ed438242ce0ddc9737d7f17269afce"} Apr 24 22:33:58.038101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.037723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"a04b8f5b6978b7067ad13e57d856920df5fa781dbb61e3599832d566c76dab24"} Apr 24 22:33:58.038101 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.037731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e94ba290-a213-4ae9-a127-b390467cc4c9","Type":"ContainerStarted","Data":"02439f1964ee152cd270a74edeaaf258fc21daac1ec3230aae072a8fe03b229c"} Apr 24 22:33:58.064304 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.064218 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.064204015 podStartE2EDuration="2.064204015s" podCreationTimestamp="2026-04-24 22:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:33:58.061569391 +0000 UTC m=+225.356915631" watchObservedRunningTime="2026-04-24 22:33:58.064204015 +0000 UTC m=+225.359550255" Apr 24 22:33:58.956030 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.955993 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:33:58.956492 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.956445 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="prometheus" containerID="cri-o://8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19" gracePeriod=600 Apr 24 22:33:58.956580 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.956483 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy" containerID="cri-o://67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a" gracePeriod=600 Apr 24 22:33:58.956580 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.956543 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-web" 
containerID="cri-o://4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41" gracePeriod=600 Apr 24 22:33:58.956711 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.956540 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="config-reloader" containerID="cri-o://c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e" gracePeriod=600 Apr 24 22:33:58.956711 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.956511 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="thanos-sidecar" containerID="cri-o://9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf" gracePeriod=600 Apr 24 22:33:58.956869 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:58.956684 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-thanos" containerID="cri-o://fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c" gracePeriod=600 Apr 24 22:33:59.221528 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.221504 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:33:59.358218 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358181 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-tls-assets\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358218 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358221 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358239 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-grpc-tls\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358257 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-kubelet-serving-ca-bundle\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358276 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-rulefiles-0\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358308 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-serving-certs-ca-bundle\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358335 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358355 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-thanos-prometheus-http-client-file\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358382 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-trusted-ca-bundle\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358409 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgfrk\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-kube-api-access-lgfrk\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: 
\"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358463 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358450 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-tls\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358481 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-kube-rbac-proxy\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358506 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-metrics-client-ca\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358559 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-web-config\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358588 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: 
\"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358631 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-metrics-client-certs\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358660 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config-out\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358701 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-db\") pod \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\" (UID: \"c2f11f4a-5570-4741-8e3f-dcb33449fcc2\") " Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358783 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:59.358912 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.358803 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:59.359555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.359046 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.359555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.359068 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.359555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.359217 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:59.359555 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.359226 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:59.360403 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.360308 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:59.360403 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.360374 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:33:59.362081 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.362041 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-kube-api-access-lgfrk" (OuterVolumeSpecName: "kube-api-access-lgfrk") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "kube-api-access-lgfrk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:59.363143 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.362875 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.363143 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363062 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.363143 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363102 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config" (OuterVolumeSpecName: "config") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.363378 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363148 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.363378 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363165 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config-out" (OuterVolumeSpecName: "config-out") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:33:59.363636 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363611 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.363716 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363617 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:59.363716 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363691 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.363851 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.363832 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.364341 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.364329 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.373738 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.373716 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-web-config" (OuterVolumeSpecName: "web-config") pod "c2f11f4a-5570-4741-8e3f-dcb33449fcc2" (UID: "c2f11f4a-5570-4741-8e3f-dcb33449fcc2"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:59.459440 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459416 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-web-config\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459440 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459442 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459456 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-metrics-client-certs\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459474 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config-out\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459484 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-db\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459493 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-tls-assets\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 
22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459503 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-config\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459511 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-grpc-tls\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459521 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459531 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459545 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459557 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 
22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459566 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgfrk\" (UniqueName: \"kubernetes.io/projected/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-kube-api-access-lgfrk\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459575 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459583 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459584 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-secret-kube-rbac-proxy\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:33:59.459984 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:33:59.459593 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f11f4a-5570-4741-8e3f-dcb33449fcc2-configmap-metrics-client-ca\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:34:00.046972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.046935 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c" exitCode=0 Apr 24 22:34:00.046972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.046964 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a" exitCode=0 Apr 24 22:34:00.046972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.046970 2577 generic.go:358] "Generic (PLEG): container 
finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41" exitCode=0 Apr 24 22:34:00.046972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.046976 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf" exitCode=0 Apr 24 22:34:00.046972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.046981 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e" exitCode=0 Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.046986 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19" exitCode=0 Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047032 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047059 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c2f11f4a-5570-4741-8e3f-dcb33449fcc2","Type":"ContainerDied","Data":"98739c984366668af956c952ad0c1d323e2bbc92b642b875a20eba100f36362c"} Apr 24 22:34:00.047250 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.047121 2577 scope.go:117] "RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c" Apr 24 22:34:00.054921 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.054907 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a" Apr 24 22:34:00.062129 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.062111 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41" Apr 24 22:34:00.068940 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.068918 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf" Apr 24 22:34:00.070216 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.070198 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:34:00.076367 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.076294 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e" Apr 24 22:34:00.077731 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.077714 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:34:00.083132 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.083116 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19" Apr 24 22:34:00.090025 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.090009 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f" Apr 24 22:34:00.096638 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.096621 2577 scope.go:117] 
"RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c" Apr 24 22:34:00.096925 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.096904 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c" Apr 24 22:34:00.096980 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.096936 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} err="failed to get container status \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist" Apr 24 22:34:00.096980 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.096954 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a" Apr 24 22:34:00.097197 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.097177 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a" Apr 24 22:34:00.097264 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.097208 2577 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} err="failed to get container status \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist" Apr 24 22:34:00.097264 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.097231 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41" Apr 24 22:34:00.097466 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.097439 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41" Apr 24 22:34:00.097508 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.097476 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} err="failed to get container status \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist" Apr 24 22:34:00.097508 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.097492 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf" Apr 24 
22:34:00.097705 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.097688 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"
Apr 24 22:34:00.097761 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.097709 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} err="failed to get container status \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist"
Apr 24 22:34:00.097761 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.097724 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.098002 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.097988 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.098043 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098006 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} err="failed to get container status \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist"
Apr 24 22:34:00.098043 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098018 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.098257 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.098223 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.098328 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098250 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} err="failed to get container status \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist"
Apr 24 22:34:00.098328 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098269 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.098507 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:34:00.098491 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.098558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098512 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} err="failed to get container status \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist"
Apr 24 22:34:00.098558 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098525 2577 scope.go:117] "RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"
Apr 24 22:34:00.098732 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098716 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} err="failed to get container status \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist"
Apr 24 22:34:00.098786 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098733 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"
Apr 24 22:34:00.098974 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098956 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} err="failed to get container status \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist"
Apr 24 22:34:00.099018 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.098974 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"
Apr 24 22:34:00.099186 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099167 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} err="failed to get container status \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist"
Apr 24 22:34:00.099244 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099188 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"
Apr 24 22:34:00.099417 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099389 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} err="failed to get container status \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist"
Apr 24 22:34:00.099417 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099407 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.099614 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099597 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} err="failed to get container status \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist"
Apr 24 22:34:00.099614 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099614 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.099868 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099848 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} err="failed to get container status \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist"
Apr 24 22:34:00.099914 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.099868 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.100099 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100081 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} err="failed to get container status \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist"
Apr 24 22:34:00.100156 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100101 2577 scope.go:117] "RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"
Apr 24 22:34:00.100304 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100289 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} err="failed to get container status \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist"
Apr 24 22:34:00.100349 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100304 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"
Apr 24 22:34:00.100502 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100484 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} err="failed to get container status \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist"
Apr 24 22:34:00.100502 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100503 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"
Apr 24 22:34:00.100698 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100682 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} err="failed to get container status \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist"
Apr 24 22:34:00.100790 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100698 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"
Apr 24 22:34:00.100942 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100926 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} err="failed to get container status \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist"
Apr 24 22:34:00.100981 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.100942 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.101118 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.101102 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} err="failed to get container status \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist"
Apr 24 22:34:00.101118 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.101117 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.101768 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.101481 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} err="failed to get container status \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist"
Apr 24 22:34:00.101768 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.101508 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.101768 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.101718 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} err="failed to get container status \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist"
Apr 24 22:34:00.101768 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.101737 2577 scope.go:117] "RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"
Apr 24 22:34:00.102027 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102006 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} err="failed to get container status \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist"
Apr 24 22:34:00.102090 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102029 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"
Apr 24 22:34:00.102340 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102314 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} err="failed to get container status \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist"
Apr 24 22:34:00.102340 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102339 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"
Apr 24 22:34:00.102620 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102596 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} err="failed to get container status \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist"
Apr 24 22:34:00.102669 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102623 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"
Apr 24 22:34:00.102906 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102888 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} err="failed to get container status \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist"
Apr 24 22:34:00.102968 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.102907 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.103171 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103146 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} err="failed to get container status \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist"
Apr 24 22:34:00.103233 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103174 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.103433 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103416 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 22:34:00.103514 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103441 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} err="failed to get container status \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist"
Apr 24 22:34:00.103514 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103456 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.103740 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103700 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} err="failed to get container status \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist"
Apr 24 22:34:00.103740 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103719 2577 scope.go:117] "RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"
Apr 24 22:34:00.103740 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103741 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="init-config-reloader"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103774 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="init-config-reloader"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103786 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="thanos-sidecar"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103792 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="thanos-sidecar"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103815 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="config-reloader"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103821 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="config-reloader"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103830 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103835 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103843 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-thanos"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103850 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-thanos"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103857 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-web"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103863 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-web"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103869 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="prometheus"
Apr 24 22:34:00.103879 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103875 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="prometheus"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103938 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="config-reloader"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103951 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="thanos-sidecar"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103961 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-thanos"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103971 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="prometheus"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103982 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103980 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} err="failed to get container status \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104002 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.103992 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" containerName="kube-rbac-proxy-web"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104200 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} err="failed to get container status \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist"
Apr 24 22:34:00.104308 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104220 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"
Apr 24 22:34:00.104692 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104407 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} err="failed to get container status \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist"
Apr 24 22:34:00.104692 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104422 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"
Apr 24 22:34:00.104692 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104643 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} err="failed to get container status \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist"
Apr 24 22:34:00.104692 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104664 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.104941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104911 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} err="failed to get container status \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist"
Apr 24 22:34:00.105001 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.104943 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.105175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105159 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} err="failed to get container status \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist"
Apr 24 22:34:00.105243 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105177 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.105379 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105358 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} err="failed to get container status \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist"
Apr 24 22:34:00.105430 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105382 2577 scope.go:117] "RemoveContainer" containerID="fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"
Apr 24 22:34:00.105616 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105600 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c"} err="failed to get container status \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": rpc error: code = NotFound desc = could not find container \"fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c\": container with ID starting with fb3a72b9986cf527e0639e959c5d10f57fb730e0f31a3564d223254acd1ff38c not found: ID does not exist"
Apr 24 22:34:00.105665 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105616 2577 scope.go:117] "RemoveContainer" containerID="67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"
Apr 24 22:34:00.105853 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105835 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a"} err="failed to get container status \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": rpc error: code = NotFound desc = could not find container \"67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a\": container with ID starting with 67a18cbae335822d3fd957b0dec2287f0b9b60ef615576b6ed6803467616933a not found: ID does not exist"
Apr 24 22:34:00.105894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.105854 2577 scope.go:117] "RemoveContainer" containerID="4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"
Apr 24 22:34:00.106072 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106054 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41"} err="failed to get container status \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": rpc error: code = NotFound desc = could not find container \"4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41\": container with ID starting with 4ce0567e9d56abe579e5d9aa8830a889f398a5b9c5adf6ebc2b5f8de32666d41 not found: ID does not exist"
Apr 24 22:34:00.106110 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106074 2577 scope.go:117] "RemoveContainer" containerID="9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"
Apr 24 22:34:00.106259 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106244 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf"} err="failed to get container status \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": rpc error: code = NotFound desc = could not find container \"9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf\": container with ID starting with 9765f62f82bd464149298e79caf4f4e129c7af6ddbfce03bc75e9102026526bf not found: ID does not exist"
Apr 24 22:34:00.106300 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106259 2577 scope.go:117] "RemoveContainer" containerID="c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"
Apr 24 22:34:00.106454 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106437 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e"} err="failed to get container status \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": rpc error: code = NotFound desc = could not find container \"c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e\": container with ID starting with c72f7aa13984a40ccc25b34d6cfeefa5d22591af3f5bf4c8e26ac7ae159a6e8e not found: ID does not exist"
Apr 24 22:34:00.106497 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106456 2577 scope.go:117] "RemoveContainer" containerID="8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"
Apr 24 22:34:00.106662 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106645 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19"} err="failed to get container status \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": rpc error: code = NotFound desc = could not find container \"8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19\": container with ID starting with 8ac45a67df86c95d7217c1b1b8011022ff6e3d0404f69bb0de170904b68c8f19 not found: ID does not exist"
Apr 24 22:34:00.106711 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106662 2577 scope.go:117] "RemoveContainer" containerID="6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"
Apr 24 22:34:00.106915 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.106896 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f"} err="failed to get container status \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": rpc error: code = NotFound desc = could not find container \"6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f\": container with ID starting with 6c5217f499116f9236b18b8944e778825203060c9cc2ad1ae6d89902679e025f not found: ID does not exist"
Apr 24 22:34:00.109391 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.109377 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:34:00.111815 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.111798 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 22:34:00.112079 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 22:34:00.112141 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112076 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2ie52qm24gs8r\""
Apr 24 22:34:00.112219 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112201 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 22:34:00.112271 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112209 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 22:34:00.112271 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112209 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 22:34:00.112356 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112319 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 22:34:00.112575 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 22:34:00.112575 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112567 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v7cxk\""
Apr 24 22:34:00.112684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112592 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 22:34:00.112684 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.112655 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 22:34:00.113067 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.113046 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 22:34:00.115682 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.115663 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 22:34:00.117907 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.117880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 22:34:00.119793 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.119767 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165881 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-config-out\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.165975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-config\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stb9v\" (UniqueName: 
\"kubernetes.io/projected/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-kube-api-access-stb9v\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.169672 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-web-config\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.170254 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.170254 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.166344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267126 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:34:00.267044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-config-out\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267126 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267126 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-config\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stb9v\" (UniqueName: \"kubernetes.io/projected/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-kube-api-access-stb9v\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-web-config\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.267645 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.268494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.268494 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.267972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.268494 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:34:00.268330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.270461 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.269358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.270461 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.270215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.270461 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.270341 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.271475 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.270941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
22:34:00.271475 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.271373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.271475 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.271431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.271688 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.271560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-config-out\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.272472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.272081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-web-config\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.272472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.272306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-config\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.272472 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.272377 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.272695 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.272571 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.272772 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.272715 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.273169 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.273147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.274476 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.274456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
22:34:00.274846 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.274829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.277156 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.277139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stb9v\" (UniqueName: \"kubernetes.io/projected/b9a83bad-fcd4-4402-8a3a-3be65bfc7b87-kube-api-access-stb9v\") pod \"prometheus-k8s-0\" (UID: \"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.421510 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.421473 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 22:34:00.550820 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:00.550793 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 22:34:00.553383 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:34:00.553354 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a83bad_fcd4_4402_8a3a_3be65bfc7b87.slice/crio-9a88534b0c26f9aaaba8bc5e3245ada5de504043ccafa7c3cc314e8c6645583b WatchSource:0}: Error finding container 9a88534b0c26f9aaaba8bc5e3245ada5de504043ccafa7c3cc314e8c6645583b: Status 404 returned error can't find the container with id 9a88534b0c26f9aaaba8bc5e3245ada5de504043ccafa7c3cc314e8c6645583b Apr 24 22:34:01.052832 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:01.052789 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9a83bad-fcd4-4402-8a3a-3be65bfc7b87" 
containerID="2bb67c87d53e100ade1f32fc0e01aadbb32cf65bd78a1e85d13072208a780fbf" exitCode=0 Apr 24 22:34:01.052832 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:01.052833 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerDied","Data":"2bb67c87d53e100ade1f32fc0e01aadbb32cf65bd78a1e85d13072208a780fbf"} Apr 24 22:34:01.053045 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:01.052852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"9a88534b0c26f9aaaba8bc5e3245ada5de504043ccafa7c3cc314e8c6645583b"} Apr 24 22:34:01.269989 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:01.269948 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f11f4a-5570-4741-8e3f-dcb33449fcc2" path="/var/lib/kubelet/pods/c2f11f4a-5570-4741-8e3f-dcb33449fcc2/volumes" Apr 24 22:34:02.059260 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.059222 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"db48e67c7d9859c244565cd37100df4e36d140ef12bfd23688f10c671dcc009f"} Apr 24 22:34:02.059405 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.059266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"e083098101c9d18ffdb41ed87c5783204d518b2c40f0577d08b82f48c92b44cd"} Apr 24 22:34:02.059405 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.059280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"17c53cfdfcda7ff8775c691949137e4fedc585df82c0bef9f4efecfec73a1174"}
Apr 24 22:34:02.059405 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.059291 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"b38ac21122f36940dfcfa4b7fccd188bedd46eb7f772196a56de3c548825a77b"}
Apr 24 22:34:02.059405 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.059303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"ae8259a2c5db16cdc5f2c10a1760639d87723ce79b9bb94e93c1ac9e3b61e0d2"}
Apr 24 22:34:02.059405 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.059315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9a83bad-fcd4-4402-8a3a-3be65bfc7b87","Type":"ContainerStarted","Data":"866c9adc2a7b345daa1cf4d9f670b6f6f619a3c6cfcebb7b5455183a8526aa6e"}
Apr 24 22:34:02.088822 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:02.088738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.088720374 podStartE2EDuration="2.088720374s" podCreationTimestamp="2026-04-24 22:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:34:02.086271846 +0000 UTC m=+229.381618086" watchObservedRunningTime="2026-04-24 22:34:02.088720374 +0000 UTC m=+229.384066615"
Apr 24 22:34:05.422482 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:34:05.422443 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:35:00.422241 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:35:00.422204 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:35:00.437354 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:35:00.437316 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:35:01.248680 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:35:01.248652 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 22:35:13.152216 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:35:13.152186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:35:13.152663 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:35:13.152404 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:35:13.157924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:35:13.157899 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 22:36:26.856154 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.856066 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tt6mr"]
Apr 24 22:36:26.859318 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.859302 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:26.862024 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.862006 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 24 22:36:26.862104 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.862088 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 24 22:36:26.863361 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.863346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-fp4fp\""
Apr 24 22:36:26.868135 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.868111 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tt6mr"]
Apr 24 22:36:26.896018 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.895971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e85ab9b7-56d0-42e2-8fc0-fc74084c48fb-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tt6mr\" (UID: \"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:26.896018 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.896015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfc9z\" (UniqueName: \"kubernetes.io/projected/e85ab9b7-56d0-42e2-8fc0-fc74084c48fb-kube-api-access-lfc9z\") pod \"cert-manager-webhook-587ccfb98-tt6mr\" (UID: \"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:26.997318 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.997268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e85ab9b7-56d0-42e2-8fc0-fc74084c48fb-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tt6mr\" (UID: \"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:26.997318 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:26.997327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfc9z\" (UniqueName: \"kubernetes.io/projected/e85ab9b7-56d0-42e2-8fc0-fc74084c48fb-kube-api-access-lfc9z\") pod \"cert-manager-webhook-587ccfb98-tt6mr\" (UID: \"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:27.006003 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:27.005970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfc9z\" (UniqueName: \"kubernetes.io/projected/e85ab9b7-56d0-42e2-8fc0-fc74084c48fb-kube-api-access-lfc9z\") pod \"cert-manager-webhook-587ccfb98-tt6mr\" (UID: \"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:27.006141 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:27.006120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e85ab9b7-56d0-42e2-8fc0-fc74084c48fb-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-tt6mr\" (UID: \"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb\") " pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:27.178479 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:27.178451 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:27.301955 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:27.301926 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-tt6mr"]
Apr 24 22:36:27.304815 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:36:27.304785 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85ab9b7_56d0_42e2_8fc0_fc74084c48fb.slice/crio-28363ae1bbe7ecf2bf9db7f34f5d8bf155880f1544db00d85c71906a2c3bfb81 WatchSource:0}: Error finding container 28363ae1bbe7ecf2bf9db7f34f5d8bf155880f1544db00d85c71906a2c3bfb81: Status 404 returned error can't find the container with id 28363ae1bbe7ecf2bf9db7f34f5d8bf155880f1544db00d85c71906a2c3bfb81
Apr 24 22:36:27.307069 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:27.307051 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:36:27.474423 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:27.474333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr" event={"ID":"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb","Type":"ContainerStarted","Data":"28363ae1bbe7ecf2bf9db7f34f5d8bf155880f1544db00d85c71906a2c3bfb81"}
Apr 24 22:36:30.486475 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:30.486442 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr" event={"ID":"e85ab9b7-56d0-42e2-8fc0-fc74084c48fb","Type":"ContainerStarted","Data":"fac9a6bcb5e9ee21b13a143c555849a12d06621ca527a76f5dc619bceb938eac"}
Apr 24 22:36:30.486838 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:30.486500 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:30.503536 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:30.503485 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr" podStartSLOduration=1.410020354 podStartE2EDuration="4.50347163s" podCreationTimestamp="2026-04-24 22:36:26 +0000 UTC" firstStartedPulling="2026-04-24 22:36:27.307207622 +0000 UTC m=+374.602553840" lastFinishedPulling="2026-04-24 22:36:30.400658885 +0000 UTC m=+377.696005116" observedRunningTime="2026-04-24 22:36:30.501562866 +0000 UTC m=+377.796909110" watchObservedRunningTime="2026-04-24 22:36:30.50347163 +0000 UTC m=+377.798817869"
Apr 24 22:36:32.774183 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.774150 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-5skwq"]
Apr 24 22:36:32.777379 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.777363 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:32.780448 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.780429 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2m75x\""
Apr 24 22:36:32.786370 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.786349 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-5skwq"]
Apr 24 22:36:32.856920 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.856893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtl4\" (UniqueName: \"kubernetes.io/projected/ca8aeee8-dc39-4ccd-88e2-bede3984a059-kube-api-access-mqtl4\") pod \"cert-manager-cainjector-68b757865b-5skwq\" (UID: \"ca8aeee8-dc39-4ccd-88e2-bede3984a059\") " pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:32.857070 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.856947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca8aeee8-dc39-4ccd-88e2-bede3984a059-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-5skwq\" (UID: \"ca8aeee8-dc39-4ccd-88e2-bede3984a059\") " pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:32.958181 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.958150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtl4\" (UniqueName: \"kubernetes.io/projected/ca8aeee8-dc39-4ccd-88e2-bede3984a059-kube-api-access-mqtl4\") pod \"cert-manager-cainjector-68b757865b-5skwq\" (UID: \"ca8aeee8-dc39-4ccd-88e2-bede3984a059\") " pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:32.958338 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.958220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca8aeee8-dc39-4ccd-88e2-bede3984a059-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-5skwq\" (UID: \"ca8aeee8-dc39-4ccd-88e2-bede3984a059\") " pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:32.970375 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.970344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca8aeee8-dc39-4ccd-88e2-bede3984a059-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-5skwq\" (UID: \"ca8aeee8-dc39-4ccd-88e2-bede3984a059\") " pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:32.970496 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:32.970422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtl4\" (UniqueName: \"kubernetes.io/projected/ca8aeee8-dc39-4ccd-88e2-bede3984a059-kube-api-access-mqtl4\") pod \"cert-manager-cainjector-68b757865b-5skwq\" (UID: \"ca8aeee8-dc39-4ccd-88e2-bede3984a059\") " pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:33.087046 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:33.086958 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq"
Apr 24 22:36:33.209566 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:33.209522 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-5skwq"]
Apr 24 22:36:33.212374 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:36:33.212353 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca8aeee8_dc39_4ccd_88e2_bede3984a059.slice/crio-dca1a30865243f6dfc0085a093f5d9e1a2228cf41b1ae393e3a71d20ad6ab747 WatchSource:0}: Error finding container dca1a30865243f6dfc0085a093f5d9e1a2228cf41b1ae393e3a71d20ad6ab747: Status 404 returned error can't find the container with id dca1a30865243f6dfc0085a093f5d9e1a2228cf41b1ae393e3a71d20ad6ab747
Apr 24 22:36:33.498517 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:33.498484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq" event={"ID":"ca8aeee8-dc39-4ccd-88e2-bede3984a059","Type":"ContainerStarted","Data":"2d1b3b056370f0a3541dc85d2b142ecca86a3e48bea77186e15d3b34b30e2cd5"}
Apr 24 22:36:33.498517 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:33.498526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq" event={"ID":"ca8aeee8-dc39-4ccd-88e2-bede3984a059","Type":"ContainerStarted","Data":"dca1a30865243f6dfc0085a093f5d9e1a2228cf41b1ae393e3a71d20ad6ab747"}
Apr 24 22:36:33.515646 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:33.515595 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-5skwq" podStartSLOduration=1.5155827739999999 podStartE2EDuration="1.515582774s" podCreationTimestamp="2026-04-24 22:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:36:33.514518756 +0000 UTC m=+380.809864997" watchObservedRunningTime="2026-04-24 22:36:33.515582774 +0000 UTC m=+380.810929014"
Apr 24 22:36:36.491573 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:36.491546 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-tt6mr"
Apr 24 22:36:57.907741 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.907701 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"]
Apr 24 22:36:57.911624 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.911599 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:57.917209 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.917187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-jq65j\""
Apr 24 22:36:57.917789 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.917767 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 24 22:36:57.920284 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.920268 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 24 22:36:57.920666 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.920647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 24 22:36:57.922324 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.922310 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 24 22:36:57.923311 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.923297 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 24 22:36:57.928467 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:57.928448 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"]
Apr 24 22:36:58.067997 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.067957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvcfq\" (UniqueName: \"kubernetes.io/projected/fb826e95-be62-48a6-aea4-26e6f8d720fe-kube-api-access-gvcfq\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.068257 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.068011 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb826e95-be62-48a6-aea4-26e6f8d720fe-metrics-cert\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.068257 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.068115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb826e95-be62-48a6-aea4-26e6f8d720fe-cert\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.068257 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.068211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fb826e95-be62-48a6-aea4-26e6f8d720fe-manager-config\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.168915 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.168880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb826e95-be62-48a6-aea4-26e6f8d720fe-metrics-cert\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.169076 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.168942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb826e95-be62-48a6-aea4-26e6f8d720fe-cert\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.169076 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.169006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fb826e95-be62-48a6-aea4-26e6f8d720fe-manager-config\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.169076 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.169034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvcfq\" (UniqueName: \"kubernetes.io/projected/fb826e95-be62-48a6-aea4-26e6f8d720fe-kube-api-access-gvcfq\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.169712 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.169682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fb826e95-be62-48a6-aea4-26e6f8d720fe-manager-config\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.171669 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.171641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb826e95-be62-48a6-aea4-26e6f8d720fe-metrics-cert\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.171812 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.171740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb826e95-be62-48a6-aea4-26e6f8d720fe-cert\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.180004 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.179985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvcfq\" (UniqueName: \"kubernetes.io/projected/fb826e95-be62-48a6-aea4-26e6f8d720fe-kube-api-access-gvcfq\") pod \"lws-controller-manager-859c5c9fd7-9sgp7\" (UID: \"fb826e95-be62-48a6-aea4-26e6f8d720fe\") " pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.221520 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.221496 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:36:58.357728 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.357695 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"]
Apr 24 22:36:58.360842 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:36:58.360813 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb826e95_be62_48a6_aea4_26e6f8d720fe.slice/crio-3cf73c1321b3e0c528a9751fc9c51f6d4a7d5dbd214319baf59edfcf9467580d WatchSource:0}: Error finding container 3cf73c1321b3e0c528a9751fc9c51f6d4a7d5dbd214319baf59edfcf9467580d: Status 404 returned error can't find the container with id 3cf73c1321b3e0c528a9751fc9c51f6d4a7d5dbd214319baf59edfcf9467580d
Apr 24 22:36:58.578586 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:36:58.578507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7" event={"ID":"fb826e95-be62-48a6-aea4-26e6f8d720fe","Type":"ContainerStarted","Data":"3cf73c1321b3e0c528a9751fc9c51f6d4a7d5dbd214319baf59edfcf9467580d"}
Apr 24 22:37:01.589627 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:01.589588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7" event={"ID":"fb826e95-be62-48a6-aea4-26e6f8d720fe","Type":"ContainerStarted","Data":"bfe0ee82a4414f7575dd132dedeb0f1fb6473fac75e0ca558240319af8e17f79"}
Apr 24 22:37:01.590035 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:01.589712 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:37:01.607826 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:01.607775 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7" podStartSLOduration=2.192710734 podStartE2EDuration="4.607734701s" podCreationTimestamp="2026-04-24 22:36:57 +0000 UTC" firstStartedPulling="2026-04-24 22:36:58.362838417 +0000 UTC m=+405.658184636" lastFinishedPulling="2026-04-24 22:37:00.777862386 +0000 UTC m=+408.073208603" observedRunningTime="2026-04-24 22:37:01.60697335 +0000 UTC m=+408.902319614" watchObservedRunningTime="2026-04-24 22:37:01.607734701 +0000 UTC m=+408.903080937"
Apr 24 22:37:12.595487 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:12.595458 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-859c5c9fd7-9sgp7"
Apr 24 22:37:52.070432 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.070393 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"]
Apr 24 22:37:52.076935 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.076917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:37:52.079542 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.079525 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 24 22:37:52.080858 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.080842 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 24 22:37:52.080936 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.080847 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-lcfw2\""
Apr 24 22:37:52.087220 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.087194 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"]
Apr 24 22:37:52.218859 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.218819 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlp4s\" (UniqueName: \"kubernetes.io/projected/aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26-kube-api-access-dlp4s\") pod \"limitador-operator-controller-manager-c7fb4c8d5-45npf\" (UID: \"aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:37:52.320144 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.320105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlp4s\" (UniqueName: \"kubernetes.io/projected/aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26-kube-api-access-dlp4s\") pod \"limitador-operator-controller-manager-c7fb4c8d5-45npf\" (UID: \"aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:37:52.331208 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.331137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlp4s\" (UniqueName: \"kubernetes.io/projected/aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26-kube-api-access-dlp4s\") pod \"limitador-operator-controller-manager-c7fb4c8d5-45npf\" (UID: \"aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:37:52.387826 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.387797 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:37:52.532060 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.532028 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"]
Apr 24 22:37:52.535244 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:37:52.535205 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaecb9ce6_9585_4c15_8d4c_a4edfb9bfc26.slice/crio-edada11e62ebc9b3b114aa6945792d2e1ad205d309722b3240a29f31a2ab7149 WatchSource:0}: Error finding container edada11e62ebc9b3b114aa6945792d2e1ad205d309722b3240a29f31a2ab7149: Status 404 returned error can't find the container with id edada11e62ebc9b3b114aa6945792d2e1ad205d309722b3240a29f31a2ab7149
Apr 24 22:37:52.751379 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:52.751346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf" event={"ID":"aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26","Type":"ContainerStarted","Data":"edada11e62ebc9b3b114aa6945792d2e1ad205d309722b3240a29f31a2ab7149"}
Apr 24 22:37:55.765319 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:55.765276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf" event={"ID":"aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26","Type":"ContainerStarted","Data":"717ebcea378136593aeae998926a14f6a0f63890c633f1479d51126a0a167dab"}
Apr 24 22:37:55.765721 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:55.765365 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:37:55.790911 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:37:55.790850 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf" podStartSLOduration=1.140449057 podStartE2EDuration="3.790829892s" podCreationTimestamp="2026-04-24 22:37:52 +0000 UTC" firstStartedPulling="2026-04-24 22:37:52.537306618 +0000 UTC m=+459.832652837" lastFinishedPulling="2026-04-24 22:37:55.187687441 +0000 UTC m=+462.483033672" observedRunningTime="2026-04-24 22:37:55.79010571 +0000 UTC m=+463.085451962" watchObservedRunningTime="2026-04-24 22:37:55.790829892 +0000 UTC m=+463.086176136"
Apr 24 22:38:06.770199 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:06.770169 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-45npf"
Apr 24 22:38:37.471457 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.471416 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"]
Apr 24 22:38:37.479227 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.479205 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.481791 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.481766 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"]
Apr 24 22:38:37.481924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.481808 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hg677\""
Apr 24 22:38:37.481924 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.481817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 24 22:38:37.561610 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.561567 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"]
Apr 24 22:38:37.589551 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.589525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c95bfa76-0768-4469-befd-ed4bc301180f-config-file\") pod \"limitador-limitador-64c8f475fb-6wftd\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.589685 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.589576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4fgr\" (UniqueName: \"kubernetes.io/projected/c95bfa76-0768-4469-befd-ed4bc301180f-kube-api-access-t4fgr\") pod \"limitador-limitador-64c8f475fb-6wftd\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.691065 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.691022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c95bfa76-0768-4469-befd-ed4bc301180f-config-file\") pod \"limitador-limitador-64c8f475fb-6wftd\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.691245 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.691096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4fgr\" (UniqueName: \"kubernetes.io/projected/c95bfa76-0768-4469-befd-ed4bc301180f-kube-api-access-t4fgr\") pod \"limitador-limitador-64c8f475fb-6wftd\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.691612 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.691592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c95bfa76-0768-4469-befd-ed4bc301180f-config-file\") pod \"limitador-limitador-64c8f475fb-6wftd\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.701585 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.701557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4fgr\" (UniqueName: \"kubernetes.io/projected/c95bfa76-0768-4469-befd-ed4bc301180f-kube-api-access-t4fgr\") pod \"limitador-limitador-64c8f475fb-6wftd\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.790647 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.790575 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:37.909604 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:37.909580 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"]
Apr 24 22:38:37.912443 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:38:37.912406 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc95bfa76_0768_4469_befd_ed4bc301180f.slice/crio-5cf60e36d13695df2ce2c18acc6ed34fe4fd8f483d995e46ca1fab00a6c266d1 WatchSource:0}: Error finding container 5cf60e36d13695df2ce2c18acc6ed34fe4fd8f483d995e46ca1fab00a6c266d1: Status 404 returned error can't find the container with id 5cf60e36d13695df2ce2c18acc6ed34fe4fd8f483d995e46ca1fab00a6c266d1
Apr 24 22:38:38.895817 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:38.895778 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" event={"ID":"c95bfa76-0768-4469-befd-ed4bc301180f","Type":"ContainerStarted","Data":"5cf60e36d13695df2ce2c18acc6ed34fe4fd8f483d995e46ca1fab00a6c266d1"}
Apr 24 22:38:41.907972 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:41.907931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" event={"ID":"c95bfa76-0768-4469-befd-ed4bc301180f","Type":"ContainerStarted","Data":"5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598"}
Apr 24 22:38:41.908415 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:41.907995 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:38:41.925507 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:41.925453 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" podStartSLOduration=1.152666387 podStartE2EDuration="4.925438163s" podCreationTimestamp="2026-04-24 22:38:37 +0000 UTC" firstStartedPulling="2026-04-24 22:38:37.920577709 +0000 UTC m=+505.215923928" lastFinishedPulling="2026-04-24 22:38:41.693349487 +0000 UTC m=+508.988695704" observedRunningTime="2026-04-24 22:38:41.925137289 +0000 UTC m=+509.220483540" watchObservedRunningTime="2026-04-24 22:38:41.925438163 +0000 UTC m=+509.220784404"
Apr 24 22:38:52.911791 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:38:52.911733 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:40:13.185175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:40:13.185141 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:40:13.186144 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:40:13.186118 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:43:37.778483 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:37.778391 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"]
Apr 24 22:43:37.779026 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:37.778634 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" podUID="c95bfa76-0768-4469-befd-ed4bc301180f" containerName="limitador" containerID="cri-o://5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598" gracePeriod=30
Apr 24 22:43:38.318160 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.318136 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd"
Apr 24 22:43:38.403814 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.403704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4fgr\" (UniqueName: \"kubernetes.io/projected/c95bfa76-0768-4469-befd-ed4bc301180f-kube-api-access-t4fgr\") pod \"c95bfa76-0768-4469-befd-ed4bc301180f\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") "
Apr 24 22:43:38.403963 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.403816 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c95bfa76-0768-4469-befd-ed4bc301180f-config-file\") pod \"c95bfa76-0768-4469-befd-ed4bc301180f\" (UID: \"c95bfa76-0768-4469-befd-ed4bc301180f\") "
Apr 24 22:43:38.404189 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.404150 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c95bfa76-0768-4469-befd-ed4bc301180f-config-file" (OuterVolumeSpecName: "config-file") pod "c95bfa76-0768-4469-befd-ed4bc301180f" (UID: "c95bfa76-0768-4469-befd-ed4bc301180f"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:43:38.406224 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.406196 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c95bfa76-0768-4469-befd-ed4bc301180f-kube-api-access-t4fgr" (OuterVolumeSpecName: "kube-api-access-t4fgr") pod "c95bfa76-0768-4469-befd-ed4bc301180f" (UID: "c95bfa76-0768-4469-befd-ed4bc301180f"). InnerVolumeSpecName "kube-api-access-t4fgr".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:43:38.505002 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.504947 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4fgr\" (UniqueName: \"kubernetes.io/projected/c95bfa76-0768-4469-befd-ed4bc301180f-kube-api-access-t4fgr\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:43:38.505002 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.504996 2577 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c95bfa76-0768-4469-befd-ed4bc301180f-config-file\") on node \"ip-10-0-134-101.ec2.internal\" DevicePath \"\"" Apr 24 22:43:38.906403 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.906361 2577 generic.go:358] "Generic (PLEG): container finished" podID="c95bfa76-0768-4469-befd-ed4bc301180f" containerID="5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598" exitCode=0 Apr 24 22:43:38.906953 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.906446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" event={"ID":"c95bfa76-0768-4469-befd-ed4bc301180f","Type":"ContainerDied","Data":"5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598"} Apr 24 22:43:38.906953 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.906461 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" Apr 24 22:43:38.906953 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.906486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wftd" event={"ID":"c95bfa76-0768-4469-befd-ed4bc301180f","Type":"ContainerDied","Data":"5cf60e36d13695df2ce2c18acc6ed34fe4fd8f483d995e46ca1fab00a6c266d1"} Apr 24 22:43:38.906953 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.906506 2577 scope.go:117] "RemoveContainer" containerID="5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598" Apr 24 22:43:38.914858 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.914836 2577 scope.go:117] "RemoveContainer" containerID="5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598" Apr 24 22:43:38.915133 ip-10-0-134-101 kubenswrapper[2577]: E0424 22:43:38.915113 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598\": container with ID starting with 5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598 not found: ID does not exist" containerID="5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598" Apr 24 22:43:38.915200 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.915142 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598"} err="failed to get container status \"5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598\": rpc error: code = NotFound desc = could not find container \"5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598\": container with ID starting with 5e3c9bcecff95f26621bd17e236e99ec9588ef302ae45bcda999167aac966598 not found: ID does not exist" Apr 24 22:43:38.934475 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:43:38.934443 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"] Apr 24 22:43:38.937129 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:38.937101 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wftd"] Apr 24 22:43:39.263688 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:39.263654 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c95bfa76-0768-4469-befd-ed4bc301180f" path="/var/lib/kubelet/pods/c95bfa76-0768-4469-befd-ed4bc301180f/volumes" Apr 24 22:43:53.027443 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.027412 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-8wr9b"] Apr 24 22:43:53.027815 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.027739 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c95bfa76-0768-4469-befd-ed4bc301180f" containerName="limitador" Apr 24 22:43:53.027815 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.027765 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95bfa76-0768-4469-befd-ed4bc301180f" containerName="limitador" Apr 24 22:43:53.027894 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.027837 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c95bfa76-0768-4469-befd-ed4bc301180f" containerName="limitador" Apr 24 22:43:53.032056 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.032036 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.034826 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.034810 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hg677\"" Apr 24 22:43:53.035426 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.035410 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 22:43:53.043196 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.043174 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-8wr9b"] Apr 24 22:43:53.084431 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.084399 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-8wr9b"] Apr 24 22:43:53.132048 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.132012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttzf5\" (UniqueName: \"kubernetes.io/projected/1ff6494f-a74f-4838-b08d-e2a9f8a66395-kube-api-access-ttzf5\") pod \"limitador-limitador-64c8f475fb-8wr9b\" (UID: \"1ff6494f-a74f-4838-b08d-e2a9f8a66395\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.132223 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.132124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ff6494f-a74f-4838-b08d-e2a9f8a66395-config-file\") pod \"limitador-limitador-64c8f475fb-8wr9b\" (UID: \"1ff6494f-a74f-4838-b08d-e2a9f8a66395\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.233204 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.233168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/1ff6494f-a74f-4838-b08d-e2a9f8a66395-config-file\") pod \"limitador-limitador-64c8f475fb-8wr9b\" (UID: \"1ff6494f-a74f-4838-b08d-e2a9f8a66395\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.233379 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.233235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttzf5\" (UniqueName: \"kubernetes.io/projected/1ff6494f-a74f-4838-b08d-e2a9f8a66395-kube-api-access-ttzf5\") pod \"limitador-limitador-64c8f475fb-8wr9b\" (UID: \"1ff6494f-a74f-4838-b08d-e2a9f8a66395\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.233868 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.233838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ff6494f-a74f-4838-b08d-e2a9f8a66395-config-file\") pod \"limitador-limitador-64c8f475fb-8wr9b\" (UID: \"1ff6494f-a74f-4838-b08d-e2a9f8a66395\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.244177 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.244150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttzf5\" (UniqueName: \"kubernetes.io/projected/1ff6494f-a74f-4838-b08d-e2a9f8a66395-kube-api-access-ttzf5\") pod \"limitador-limitador-64c8f475fb-8wr9b\" (UID: \"1ff6494f-a74f-4838-b08d-e2a9f8a66395\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.343056 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.342964 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.474898 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.474781 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-8wr9b"] Apr 24 22:43:53.477553 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:43:53.477525 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff6494f_a74f_4838_b08d_e2a9f8a66395.slice/crio-3113b68b0e0ad67f0bd935f4252700108fcd9837cfdafabf7f5687d04c6161af WatchSource:0}: Error finding container 3113b68b0e0ad67f0bd935f4252700108fcd9837cfdafabf7f5687d04c6161af: Status 404 returned error can't find the container with id 3113b68b0e0ad67f0bd935f4252700108fcd9837cfdafabf7f5687d04c6161af Apr 24 22:43:53.479422 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.479406 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:43:53.960004 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.959973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" event={"ID":"1ff6494f-a74f-4838-b08d-e2a9f8a66395","Type":"ContainerStarted","Data":"fbbf3c087106a76c1b390f5eaeb4e1b8179d77a347ec327eff0e4d3c82e1d307"} Apr 24 22:43:53.960175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.960012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" event={"ID":"1ff6494f-a74f-4838-b08d-e2a9f8a66395","Type":"ContainerStarted","Data":"3113b68b0e0ad67f0bd935f4252700108fcd9837cfdafabf7f5687d04c6161af"} Apr 24 22:43:53.960175 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:43:53.960098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:43:53.998567 ip-10-0-134-101 kubenswrapper[2577]: 
I0424 22:43:53.998507 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" podStartSLOduration=0.998493611 podStartE2EDuration="998.493611ms" podCreationTimestamp="2026-04-24 22:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:43:53.997848437 +0000 UTC m=+821.293194689" watchObservedRunningTime="2026-04-24 22:43:53.998493611 +0000 UTC m=+821.293839851" Apr 24 22:44:04.964614 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:44:04.964584 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-8wr9b" Apr 24 22:45:13.216159 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:45:13.216083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log" Apr 24 22:45:13.216941 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:45:13.216920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log" Apr 24 22:49:18.545611 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:18.545582 2577 ???:1] "http: TLS handshake error from 10.0.129.176:45168: EOF" Apr 24 22:49:18.548089 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:18.545888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-g2bql_dc6ae891-bdaf-4a23-bf88-ea8e267d1795/global-pull-secret-syncer/0.log" Apr 24 22:49:18.655505 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:18.655473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gm79c_0d65ad29-817e-44aa-888e-73e4eb91b70d/konnectivity-agent/0.log" Apr 24 22:49:18.834414 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:18.834326 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-101.ec2.internal_1067ddb593414b000288d5e34ffef8f7/haproxy/0.log" Apr 24 22:49:22.864208 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:22.864177 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-64c8f475fb-8wr9b_1ff6494f-a74f-4838-b08d-e2a9f8a66395/limitador/0.log" Apr 24 22:49:22.891056 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:22.891025 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-45npf_aecb9ce6-9585-4c15-8d4c-a4edfb9bfc26/manager/0.log" Apr 24 22:49:23.714729 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.714655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/alertmanager/0.log" Apr 24 22:49:23.733548 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.733522 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/config-reloader/0.log" Apr 24 22:49:23.751552 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.751533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/kube-rbac-proxy-web/0.log" Apr 24 22:49:23.770554 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.770537 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/kube-rbac-proxy/0.log" Apr 24 22:49:23.789162 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.789137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/kube-rbac-proxy-metric/0.log" Apr 24 22:49:23.808081 ip-10-0-134-101 kubenswrapper[2577]: I0424 
22:49:23.808059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/prom-label-proxy/0.log" Apr 24 22:49:23.831515 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.831494 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e94ba290-a213-4ae9-a127-b390467cc4c9/init-config-reloader/0.log" Apr 24 22:49:23.971922 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:23.971850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fbf6988d5-lg6v2_e0e1c5a9-e806-4eb4-8d76-8f9aa1803e05/metrics-server/0.log" Apr 24 22:49:24.032651 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.032622 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jtkx_a56a8cc4-5a2f-4884-80eb-0c8d1dee9455/node-exporter/0.log" Apr 24 22:49:24.054052 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.054018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jtkx_a56a8cc4-5a2f-4884-80eb-0c8d1dee9455/kube-rbac-proxy/0.log" Apr 24 22:49:24.068738 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.068720 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jtkx_a56a8cc4-5a2f-4884-80eb-0c8d1dee9455/init-textfile/0.log" Apr 24 22:49:24.310696 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.310669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fkm5w_9f9e77c1-2948-4c92-9f23-a02949e0904b/kube-rbac-proxy-main/0.log" Apr 24 22:49:24.334242 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.334222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fkm5w_9f9e77c1-2948-4c92-9f23-a02949e0904b/kube-rbac-proxy-self/0.log" Apr 24 22:49:24.350987 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.350964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fkm5w_9f9e77c1-2948-4c92-9f23-a02949e0904b/openshift-state-metrics/0.log" Apr 24 22:49:24.390039 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.390011 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/prometheus/0.log" Apr 24 22:49:24.406084 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.406066 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/config-reloader/0.log" Apr 24 22:49:24.426136 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.426118 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/thanos-sidecar/0.log" Apr 24 22:49:24.445062 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.445040 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/kube-rbac-proxy-web/0.log" Apr 24 22:49:24.463567 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.463548 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/kube-rbac-proxy/0.log" Apr 24 22:49:24.481542 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.481526 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/kube-rbac-proxy-thanos/0.log" Apr 24 22:49:24.513096 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.513080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9a83bad-fcd4-4402-8a3a-3be65bfc7b87/init-config-reloader/0.log" Apr 24 22:49:24.699426 
ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.699386 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-646b8cbfc-fqdkv_0c14d682-7850-4b94-aeda-73452c4cca01/thanos-query/0.log" Apr 24 22:49:24.718173 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.718148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-646b8cbfc-fqdkv_0c14d682-7850-4b94-aeda-73452c4cca01/kube-rbac-proxy-web/0.log" Apr 24 22:49:24.736849 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.736824 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-646b8cbfc-fqdkv_0c14d682-7850-4b94-aeda-73452c4cca01/kube-rbac-proxy/0.log" Apr 24 22:49:24.755466 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.755446 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-646b8cbfc-fqdkv_0c14d682-7850-4b94-aeda-73452c4cca01/prom-label-proxy/0.log" Apr 24 22:49:24.774985 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.774965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-646b8cbfc-fqdkv_0c14d682-7850-4b94-aeda-73452c4cca01/kube-rbac-proxy-rules/0.log" Apr 24 22:49:24.793066 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:24.793046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-646b8cbfc-fqdkv_0c14d682-7850-4b94-aeda-73452c4cca01/kube-rbac-proxy-metrics/0.log" Apr 24 22:49:25.841574 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:25.841543 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wkmx9_5dc0ab6c-0ed6-435b-9203-3352b2d3e68e/networking-console-plugin/0.log" Apr 24 22:49:26.748603 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:26.748574 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-6bcc868b7-4rldg_d71288e3-0a52-4c68-9d3b-643365a30e02/download-server/0.log" Apr 24 22:49:27.071576 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.071463 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"] Apr 24 22:49:27.074850 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.074831 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" Apr 24 22:49:27.077452 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.077436 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lfh5\"/\"kube-root-ca.crt\"" Apr 24 22:49:27.077548 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.077526 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5lfh5\"/\"default-dockercfg-p7s87\"" Apr 24 22:49:27.077647 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.077631 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lfh5\"/\"openshift-service-ca.crt\"" Apr 24 22:49:27.083455 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.083435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"] Apr 24 22:49:27.172678 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.172647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-sys\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" Apr 24 22:49:27.172678 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.172680 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmq5\" (UniqueName: \"kubernetes.io/projected/a0f24944-1009-4d9e-9d28-9a13bc4d808b-kube-api-access-7jmq5\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" Apr 24 22:49:27.172889 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.172713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-podres\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" Apr 24 22:49:27.172889 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.172825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-proc\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" Apr 24 22:49:27.172889 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.172882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-lib-modules\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" Apr 24 22:49:27.274196 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-lib-modules\") pod \"perf-node-gather-daemonset-v9snf\" (UID: 
\"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-sys\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmq5\" (UniqueName: \"kubernetes.io/projected/a0f24944-1009-4d9e-9d28-9a13bc4d808b-kube-api-access-7jmq5\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-podres\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274295 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-sys\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-lib-modules\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274355 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-proc\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274572 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274388 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-podres\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.274572 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.274387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a0f24944-1009-4d9e-9d28-9a13bc4d808b-proc\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.282707 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.282684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmq5\" (UniqueName: \"kubernetes.io/projected/a0f24944-1009-4d9e-9d28-9a13bc4d808b-kube-api-access-7jmq5\") pod \"perf-node-gather-daemonset-v9snf\" (UID: \"a0f24944-1009-4d9e-9d28-9a13bc4d808b\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.385183 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.385121 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:27.506610 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.506585 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"]
Apr 24 22:49:27.510017 ip-10-0-134-101 kubenswrapper[2577]: W0424 22:49:27.509984 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda0f24944_1009_4d9e_9d28_9a13bc4d808b.slice/crio-1bf88277b6758ec3facd40ef025ae3e2131ccbf98f90a9618d79a40231c75365 WatchSource:0}: Error finding container 1bf88277b6758ec3facd40ef025ae3e2131ccbf98f90a9618d79a40231c75365: Status 404 returned error can't find the container with id 1bf88277b6758ec3facd40ef025ae3e2131ccbf98f90a9618d79a40231c75365
Apr 24 22:49:27.511572 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.511557 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:49:27.953992 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.953966 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fl5t5_0fda03dc-9085-4272-aa29-583404383acf/dns/0.log"
Apr 24 22:49:27.972767 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:27.972725 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fl5t5_0fda03dc-9085-4272-aa29-583404383acf/kube-rbac-proxy/0.log"
Apr 24 22:49:28.042095 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:28.042072 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4v6p7_92fe7641-e2dc-499a-a25d-09cdcdac368b/dns-node-resolver/0.log"
Apr 24 22:49:28.071278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:28.071249 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" event={"ID":"a0f24944-1009-4d9e-9d28-9a13bc4d808b","Type":"ContainerStarted","Data":"8433a6b52406b059ec3b7aa53803413c56861c9f9a87deeaff9d8d8dfc57c3a5"}
Apr 24 22:49:28.071278 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:28.071284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" event={"ID":"a0f24944-1009-4d9e-9d28-9a13bc4d808b","Type":"ContainerStarted","Data":"1bf88277b6758ec3facd40ef025ae3e2131ccbf98f90a9618d79a40231c75365"}
Apr 24 22:49:28.071460 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:28.071332 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:28.088728 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:28.088681 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf" podStartSLOduration=1.088669677 podStartE2EDuration="1.088669677s" podCreationTimestamp="2026-04-24 22:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:49:28.086658668 +0000 UTC m=+1155.382004917" watchObservedRunningTime="2026-04-24 22:49:28.088669677 +0000 UTC m=+1155.384015916"
Apr 24 22:49:28.538693 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:28.538663 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rp5x6_9ea69dcf-def3-4c54-b774-dad54e40bced/node-ca/0.log"
Apr 24 22:49:29.717206 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:29.717176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-48kr8_7cda4d53-f962-44ab-a661-325196e9edf2/serve-healthcheck-canary/0.log"
Apr 24 22:49:30.152145 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:30.152070 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lj9nb_c2102ed1-69af-409e-b44a-dad1845de530/insights-operator/0.log"
Apr 24 22:49:30.152840 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:30.152826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lj9nb_c2102ed1-69af-409e-b44a-dad1845de530/insights-operator/1.log"
Apr 24 22:49:30.233275 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:30.233247 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mdrlq_0c309811-846a-4c71-843a-c45c3f281cce/kube-rbac-proxy/0.log"
Apr 24 22:49:30.252851 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:30.252831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mdrlq_0c309811-846a-4c71-843a-c45c3f281cce/exporter/0.log"
Apr 24 22:49:30.273966 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:30.273941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mdrlq_0c309811-846a-4c71-843a-c45c3f281cce/extractor/0.log"
Apr 24 22:49:32.364504 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:32.364472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-859c5c9fd7-9sgp7_fb826e95-be62-48a6-aea4-26e6f8d720fe/manager/0.log"
Apr 24 22:49:34.084846 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:34.084820 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-v9snf"
Apr 24 22:49:35.759406 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:35.759379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gbl9k_32d08a62-7ae6-4083-b119-caaead1033c8/migrator/0.log"
Apr 24 22:49:35.777900 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:35.777866 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gbl9k_32d08a62-7ae6-4083-b119-caaead1033c8/graceful-termination/0.log"
Apr 24 22:49:37.421028 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.421000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:49:37.473791 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.473761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/egress-router-binary-copy/0.log"
Apr 24 22:49:37.492098 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.492073 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/cni-plugins/0.log"
Apr 24 22:49:37.510042 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.510018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/bond-cni-plugin/0.log"
Apr 24 22:49:37.530337 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.530319 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/routeoverride-cni/0.log"
Apr 24 22:49:37.549074 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.549056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/whereabouts-cni-bincopy/0.log"
Apr 24 22:49:37.570140 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.570123 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8lg2_4264f6dc-3224-4773-b9ba-8ad8e185093f/whereabouts-cni/0.log"
Apr 24 22:49:37.759931 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.759857 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kbktf_429121c4-84dd-4766-8b1d-bdc0550cffd5/kube-multus/0.log"
Apr 24 22:49:37.779512 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.779481 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l8hdc_0466857c-9575-492e-9148-290f37031549/network-metrics-daemon/0.log"
Apr 24 22:49:37.795951 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:37.795931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l8hdc_0466857c-9575-492e-9148-290f37031549/kube-rbac-proxy/0.log"
Apr 24 22:49:38.970473 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:38.970444 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-controller/0.log"
Apr 24 22:49:38.988008 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:38.987983 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/0.log"
Apr 24 22:49:38.993265 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:38.993248 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovn-acl-logging/1.log"
Apr 24 22:49:39.010820 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:39.010792 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/kube-rbac-proxy-node/0.log"
Apr 24 22:49:39.031320 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:39.031289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:49:39.046872 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:39.046852 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/northd/0.log"
Apr 24 22:49:39.064703 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:39.064686 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/nbdb/0.log"
Apr 24 22:49:39.083577 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:39.083559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/sbdb/0.log"
Apr 24 22:49:39.187137 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:39.187111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbfmd_a41baed3-bda8-4fa8-a15e-3fdd5f955169/ovnkube-controller/0.log"
Apr 24 22:49:40.575821 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:40.575790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2jhbl_9b1dcaeb-db0d-424b-9f57-c60a54511aa1/network-check-target-container/0.log"
Apr 24 22:49:41.525104 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:41.525070 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-j48ng_a00368dd-8532-419e-919b-66cb1cf3e0c9/iptables-alerter/0.log"
Apr 24 22:49:42.112465 ip-10-0-134-101 kubenswrapper[2577]: I0424 22:49:42.112435 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lk7ql_a3b8675c-f9be-454b-aba6-002550b72a84/tuned/0.log"